SEO, or search engine optimization, is the process of improving a website’s ranking position for specific keywords. SEO has changed a lot since the early days of the internet; the algorithms that search engines like Google and Bing use to rank websites in search results are constantly updated, and as a result, SEO is an evolving discipline. To maintain consistent rankings for keywords related to your website, it’s important to understand the fundamentals of search engine optimization and how websites are ranked.


Search engines are ubiquitous in today’s hyper-connected world, but how they work is still a mystery to most of the people who use them. At their core, search engines work by crawling websites with a spider (or bot) and indexing them in a database. When a user enters a query, the search engine displays the pages and content that its index has ranked as most relevant to the keywords in the user’s search query. Search engines constantly update and tweak their algorithms and how they rank websites; their sole purpose is to provide users with relevant content. When you optimize your website for search engines, your number-one goal should be to provide users with relevant content and a pleasant experience.



A search engine query is a specific keyword or phrase that a user enters into a search engine. By definition, a query happens when someone is looking for information. Broadly speaking, there are three types of search queries:

Informational queries – when a user wants to learn more about a particular topic.
Navigational queries – when a user is trying to reach a specific website or page, such as searching for a brand name.
Transactional queries – when a user wants to complete a specific action, such as buying a product.
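To make the taxonomy concrete, here is a toy query classifier. The keyword lists and rules are entirely hypothetical; real search engines infer intent with far more sophisticated models.

```python
# Hypothetical keyword rules for illustration only.
NAVIGATIONAL = {"facebook", "youtube", "login", "homepage"}
TRANSACTIONAL = {"buy", "order", "price", "discount", "cheap"}

def classify(query):
    """Guess the query type from the words it contains."""
    words = set(query.lower().split())
    if words & TRANSACTIONAL:
        return "transactional"
    if words & NAVIGATIONAL:
        return "navigational"
    # Default: the user is looking for information.
    return "informational"

print(classify("buy running shoes"))
print(classify("how do crawlers work"))
```

A real engine would weigh context, history, and location rather than fixed word lists, but the three-way split above mirrors the taxonomy.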


A primary function of search engines is to use an automated program called a “spider” (also known as a bot or crawler) that travels through the internet and gathers information about websites and web pages. Crawlers move through a website by following links: they scrape (or read) the information on one page and then move to another page based on the links they find on the first. Website navigation and hierarchy matter not just for bots but for users too. Crawling isn’t only the automated process of scraping website data; search engine bots also look at whether a website is easy for users to navigate.
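The link-following behavior described above can be sketched as a breadth-first traversal. The pages and links below are an invented in-memory “site”; a real crawler would fetch pages over HTTP and extract links from the HTML.

```python
from collections import deque

# A toy website: each page maps to the links found on it (hypothetical data).
SITE = {
    "/": ["/about", "/products"],
    "/about": ["/"],
    "/products": ["/products/widget", "/"],
    "/products/widget": ["/products"],
}

def crawl(start):
    """Breadth-first crawl: scrape a page, then follow its links."""
    seen = set()
    queue = deque([start])
    order = []
    while queue:
        page = queue.popleft()
        if page in seen:
            continue
        seen.add(page)
        order.append(page)  # "scrape" the page
        for link in SITE.get(page, []):
            if link not in seen:
                queue.append(link)
    return order

print(crawl("/"))  # every page reachable from the homepage, in crawl order
```

Note that a page only gets discovered if some other page links to it, which is why clear site navigation and hierarchy matter so much to crawlers.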


After the spider has finished crawling the website, the next step is indexing: storing the results in the search engine’s database. The crawler analyzes what it finds and sorts the information based on the site’s content. That information is then stored in a huge database called an index. The index contains all the information a search engine will use when someone makes a search query.
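One classic data structure behind this step is an inverted index, which maps each word to the pages it appears on. The page contents below are made up for illustration; real indexes store far more (titles, links, freshness, and so on).

```python
import re
from collections import defaultdict

# Hypothetical crawled page contents.
PAGES = {
    "/": "acme widgets home best widgets",
    "/about": "about acme company",
    "/products": "buy widgets online",
}

def build_index(pages):
    """Map each word to the set of pages it appears on (an inverted index)."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in re.findall(r"[a-z]+", text.lower()):
            index[word].add(url)
    return index

index = build_index(PAGES)
print(sorted(index["widgets"]))  # pages that mention "widgets"
```

Looking up a query word in the index is a single dictionary access, which is what lets a search engine answer queries without re-reading the web each time.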

Ranking

Ranking is the final step and arguably the most important for digital marketers and SEOs. Now that the bot has crawled a website and stored the information in its index, it will rank each page according to a number of different search signals. Search signals are the factors a search engine takes into account when ranking a website. There are over two hundred known search signals that a search engine uses to determine the position of a website for any given search query.
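A minimal way to picture signal-based ranking is a weighted sum of per-page scores. The signals, scores, and weights below are invented for illustration; real weightings are proprietary, involve hundreds of signals, and change constantly.

```python
# Hypothetical per-page signal scores.
SIGNALS = {
    "/": {"relevance": 0.9, "page_speed": 0.7, "backlinks": 0.4},
    "/products": {"relevance": 0.8, "page_speed": 0.9, "backlinks": 0.6},
    "/about": {"relevance": 0.2, "page_speed": 0.8, "backlinks": 0.1},
}

# Made-up weights: relevance counts most, matching the engines' stated goal.
WEIGHTS = {"relevance": 0.6, "page_speed": 0.2, "backlinks": 0.2}

def rank(pages):
    """Order pages by their weighted signal score, best first."""
    def score(url):
        return sum(WEIGHTS[s] * v for s, v in pages[url].items())
    return sorted(pages, key=score, reverse=True)

print(rank(SIGNALS))  # result order for this toy query
```

Even in this toy version, a small edge in one signal (here, page speed and backlinks) can outrank a page with slightly higher relevance, which is why SEOs optimize across many signals at once.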


It’s no secret that search engines are some of the biggest traffic drivers for websites. For individuals or businesses selling products or services online, SEO is crucial to getting found by potential customers and staying competitive. Ranking a website high in search engine results pages (SERPs) and optimizing a site for search engines is a constantly changing discipline, with new guidelines and best practices introduced regularly. As the way we use the internet changes, so does SEO. Search engines like Google and Bing alter their algorithms and processes in order to provide the best (most relevant) answers to their users.