This edited extract is from Digital and Social Media Marketing: A Results-Driven Approach edited by Aleksej Heinze, Gordon Fletcher, Ana Cruz, Alex Fenton ©2024 and is reproduced with permission from Routledge. The extract below was taken from the chapter Using Search Engine Optimisation to Build Trust co-authored with Aleksej Heinze, Senior Professor at KEDGE Business School, France.
The key challenge for SEO is that good rankings in SERPs depend almost entirely on each search engine's private algorithm for identifying high-quality content and results, which makes SEO a long-term activity.
The initial formula of PageRank (Page et al. 1999) used by Google, which used links pointing to a page to rank its importance, has evolved significantly and is no longer publicly available.
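The original formulation can be sketched in a few lines of Python. The tiny link graph and damping factor below are illustrative assumptions, not real data; this uses the normalised variant of the PageRank iteration, in which the ranks sum to one.

```python
# Minimal sketch of the PageRank iteration (Page et al. 1999), normalised variant.
# The three-page link graph below is invented purely for illustration.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}  # start with uniform ranks
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # A page receives a share of the rank of every page linking to it,
            # divided by how many outgoing links the linking page has.
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new_rank[p] = (1 - damping) / len(pages) + damping * incoming
        rank = new_rank
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
# Page C accumulates the most rank, since both A and B link to it.
```

The intuition the formula captures is that a link acts as a vote, and votes from highly ranked pages count for more.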
All search engines regularly update their algorithms to identify high-quality content relevant to a particular search query. Google implements around 500–600 changes to its algorithm each year (Gillespie 2019).
These are product updates, similar to Windows updates. Most of these changes are minor with little impact, but a few critical core updates each year require careful review for most websites, since they can result in major SERP changes.
Search engines use artificial intelligence to improve their ability to identify high-quality, relevant content, and they constantly test new ways to present users with relevant results.
The arrival of ChatGPT by OpenAI in 2022 presents a rival type of offering that has shaken the foundations of the traditional search engine business model (Poola 2023).
In such a dynamic environment, it is important to keep up to date with algorithm changes.
This can be done by following the Google Search Status dashboard (Google) and by monitoring SEO-related blogs and resources such as the Moz algorithm change calendar (Moz).
How Search Engines Work
In essence, a search engine’s crawler, spider, robot or ‘bot’ discovers web page links, and then internally determines if there is value in analysing the links.
Then, the bot automatically retrieves the content behind each link (including more links). This process is called crawling.
Bots may then add the discovered pages to the search engine's index to be retrieved when a user searches for something.
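The crawl-then-index loop described above can be sketched as follows. The pages, URLs and link structure here are invented stand-ins for real HTTP fetches, which a genuine bot would perform over the live web.

```python
from collections import deque

# Stand-in for the live web: each URL maps to (visible text, outgoing links).
# These URLs and page contents are invented purely for illustration.
FAKE_WEB = {
    "https://example.com/": ("Welcome to Example", ["https://example.com/about"]),
    "https://example.com/about": ("About Example and SEO", ["https://example.com/"]),
}

def crawl_and_index(seed):
    """Breadth-first crawl from a seed URL, building a word -> set-of-URLs index."""
    index, seen, queue = {}, {seed}, deque([seed])
    while queue:
        url = queue.popleft()
        text, links = FAKE_WEB.get(url, ("", []))  # a real bot fetches over HTTP here
        for word in text.lower().split():
            index.setdefault(word, set()).add(url)  # add page to the inverted index
        for link in links:  # discover new links and queue them for crawling
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index

idx = crawl_and_index("https://example.com/")
```

The resulting inverted index is what allows a later query to be answered without re-reading every page: looking up a word returns the set of URLs that contain it.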
The ranking order in which the links appear in SERPs is calculated by the engine’s algorithm, which examines the relevance of the content to the query.
This relevance is determined by a combination of over 200 factors such as the visible text, keywords, the position and relationship of words, links, synonyms and semantic entities (Garg 2022).
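Real ranking combines hundreds of signals, but the core idea of scoring content against a query can be illustrated with a deliberately simplified term-frequency score. This toy function and the sample pages are assumptions for illustration only, not the actual algorithm of any search engine.

```python
def relevance_score(query, page_text):
    """Toy relevance signal: count how often each query term appears in the page."""
    terms = query.lower().split()
    words = page_text.lower().split()
    return sum(words.count(t) for t in terms)

# Invented example pages for illustration.
pages = {
    "page1": "search engine optimisation builds trust through quality content",
    "page2": "holiday photos from last summer",
}

# Order the pages by score, highest first, as a SERP would order its results.
serp = sorted(pages, key=lambda p: relevance_score("search engine trust", pages[p]),
              reverse=True)
```

A production ranker would weight many more factors (links, synonyms, entities, freshness and so on), but the principle is the same: each candidate page receives a score against the query, and the SERP lists pages in descending score order.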
When the user of a search engine types in a query, they are presented with a list of links to content that the engine calculates will satisfy the intent of the query – the list of results is the SERP.
Typically, the list of results that are shown in SERPs includes a mix of paid-for and organic results. Each link includes a short URL, title and description, as well as other options such as thumbnail images, videos and other related internal site links.
Search engines are constantly making changes to SERPs to improve the experience for those searching. For example, Bing includes Bing Chat, allowing responses to be offered by their AI bot.
Google introduced a knowledge graph, or summary answer box, shown to the right of the organic search results beneath the search box.
Both Bing Chat and the Google knowledge graph provide a direct and relevant summary response to a query without the need for a further click through to the source page (retaining the user at the search engine).
This offering leads to so-called zero-click searches, which cannot be tracked in a website's own analytics data and are only visible in data that measures content visibility within SERPs.
Some Google SERP snippets can also appear as a knowledge graph (Figure 12.8) or a search snippet (Figure 12.9).
Figure 12.8: Google SERP for “KEDGE Business School” including a knowledge graph on the right-hand side of the page.
Figure 12.9: Search snippet for Jean Reno.
The volatility of the SERPs can be evidenced by the varying results produced by the same search in different locations.
The listing for the US market (Figure 12.10) and the carousel for the European market (Figure 12.11) for “best DJs” show that geolocation increasingly comes into play in the page ranking of SERPs.
Personalisation is also relevant. For example, when a user is logged into a Google product, their browser history influences the organic SERPs. SERPs change depending on what terms are used.