As more and more business worldwide is conducted online, effective search engine optimization (SEO) has become a significant factor in a brand’s success. Even the most traditional businesses can no longer afford to ignore this crucial way of reaching customers in the critical moments of their buying journey. In addition to improving a website’s overall visibility and searchability, SEO can help build trust, credibility, and engagement.
There is no single tried-and-true formula for SEO success. The best practices to follow are constantly evolving and depend on various factors. Nevertheless, there are some key signals that Google has long relied upon in assessing websites and determining which ones to reward with the coveted top spots in search queries.
An In-Depth Look At Google’s Search Ranking Systems
Google uses a complex web of automatic ranking systems to assess numerous signals and factors related to the hundreds of billions of web pages on the internet. These systems enable Google to give searchers the most valuable and relevant results at lightning speed. Here is a look at the most critical signals in Google’s search ranking systems.
Bidirectional Encoder Representations From Transformers (BERT)
Bidirectional Encoder Representations from Transformers, or BERT, was launched in 2019 by Google and represented a significant evolution in natural language understanding. This model enables Google to understand how combinations of words convey different intents and meanings.
Instead of matching context to the individual words used in a search, BERT better comprehends how combinations of words are used to express more complicated ideas. Because it considers the words in a particular sequence and how they relate to one another, important words are no longer dropped from queries simply because they are small, common words such as prepositions.
Google’s official blog uses the example of the query “can you get medicine for someone pharmacy.” Before implementing BERT, this would likely have yielded results explaining how to fill a prescription in general, as the preposition “for” would have been overlooked. With BERT, however, Google can comprehend that the searcher wants to know whether they are allowed to pick up medicine for somebody else at a pharmacy, and can therefore return more helpful search results.
Google now relies heavily on BERT thanks to its superior ability to rank and retrieve pages. It is a significant factor in nearly all English queries because it can rapidly rank documents by relevance. However, Google is quick to point out that BERT works alongside other systems to generate the highest-quality results possible for a given query.
Deduplication Systems

Many search queries can return millions of matching web pages, and it is common for a significant number of these pages to be very similar. Google counters this by displaying only the most relevant results, thanks to deduplication systems that aim to avoid unhelpful repetition.
However, users can still see the results omitted by these systems if they wish. If they reach the end of the available results for a query, they may encounter a message informing them that some entries were omitted because of their similarity to results already displayed, along with a link they can click to see the omitted results.
Deduplication is also used with Google’s featured snippets. When Google features a web page as a featured snippet, that same page will not be repeated later on the first results page, making it easier for users to find the information they seek.
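Google has not published how its deduplication systems work, but one classic technique for detecting near-duplicate pages, shown here purely as an illustration, is to compare overlapping word "shingles" using Jaccard similarity:

```python
# Illustrative sketch of near-duplicate detection via word shingles and
# Jaccard similarity. The sample texts are made up; Google's actual
# deduplication systems are not public.

def shingles(text, k=3):
    """Return the set of k-word shingles in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

page_a = "how to fill a prescription at your local pharmacy today"
page_b = "how to fill a prescription at your local pharmacy quickly"
page_c = "best hiking trails near the mountains this summer season"

print(jaccard(page_a, page_b))  # high: likely near-duplicates
print(jaccard(page_a, page_c))  # zero: unrelated pages
```

Pages whose similarity exceeds some threshold could then be collapsed into a single listing, which is the behavior the omitted-results message describes.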
Exact Match Domain System
Google’s ranking systems also consider the words in a website’s domain name when determining relevancy. However, Google recognizes the importance of not giving too much credit to content hosted on a domain name chosen purely as an exact match for a specific query in an attempt to exploit this signal.
If an individual or business creates a domain name such as “top-places-to-buy-jewelry.com” to push their content higher in the rankings, it will not automatically be ranked highly for that query.
Freshness Systems

Google has implemented several freshness systems aimed at displaying newer content where appropriate. For example, a person searching for a just-released movie is likely looking for a recent review rather than articles about the beginning of its production.
This may also apply to current events. For example, a search on the term “hurricane” might show information about preparing for these storms under normal circumstances. However, in cases where a hurricane has recently made landfall, fresher content, like news articles about the storm’s path and the damage it caused, might appear higher in the results.
Helpful Content System
Google’s helpful content system aims to help searchers find valuable, original content written by people with the genuine intent to help others, rather than content created simply to appeal to search algorithms and attract more traffic.
It works by creating a signal that Google’s automated ranking systems use to reward content that gives readers a satisfying experience and to keep content that does not meet expectations from appearing too high in the results. These systems can automatically identify content that has little value or is otherwise unhelpful to people searching on the terms in question.
All of the content on sites with high amounts of content deemed unhelpful will be less likely to rank well in search results, as long as there is other content on the internet that Google believes is better to display to searchers. In other words, unhelpful content can weigh down an entire site. This also means that removing unhelpful content from a site can help boost the rankings of all the content on that site. In addition, this signal is weighted, meaning sites with a high proportion of unhelpful content could see a stronger effect.
Google reports that its classifier process is fully automated with a machine-learning model that works in every language. However, they emphasize that it is just one of many critical signals they use to rank content. Therefore, if other signals identify a particular web page as highly relevant to a query and helpful to readers, it may still rank well despite being on a site that has been classified as containing unhelpful content in the past.
Google supplies a list of questions web admins can keep in mind to ensure their web content is helpful and follows a people-first approach. For example, they suggest providing content that contains original research, analysis, or information and offers a comprehensive, insightful look at the topic. Content that people want to bookmark, recommend, or share with others is likely to rank higher, as is precisely sourced content written by an authority or enthusiast.
Link Analysis Systems & PageRank
PageRank has long been a key signal used by Google and was once the main concern for SEO. It has evolved over the years but remains a core part of determining what pages are about and which could be the most useful to searchers of a particular query.
It considers how pages link to one another to determine what they are about and how authoritative they are. In the past, the raw number of links to a page from other sites carried much of the weight, but PageRank has evolved and now considers various factors, so poor-quality sites can no longer rank higher by engaging in black-hat activities like link farming.
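The original PageRank idea can be sketched in a few lines: a page’s score depends on the scores of the pages linking to it, computed by repeated iteration over the link graph. The tiny graph, damping factor, and iteration count below are illustrative defaults; Google’s production link analysis is far more sophisticated.

```python
# Toy power-iteration PageRank over a small link graph.
DAMPING = 0.85  # classic damping factor from the original PageRank paper

def pagerank(links, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - DAMPING) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += DAMPING * rank[page] / len(pages)
            else:  # each outlink receives an equal share of this page's rank
                for target in outlinks:
                    new_rank[target] += DAMPING * rank[page] / len(outlinks)
        rank = new_rank
    return rank

graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
ranks = pagerank(graph)
# "home" ends up with the highest score: every other page links to it.
```

The key property, visible even in this sketch, is that a link from a highly ranked page is worth more than a link from an obscure one, which is why raw link counts alone stopped being a reliable shortcut.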
Neural Matching

Neural matching is a type of artificial intelligence system employed by Google to understand the various ways a concept might be represented in different pages and queries so it can produce better matches.
Not all mentions of certain concepts are straightforward and clear, meaning that relying simply on keywords is not enough to portray the true meaning of content accurately. Google’s blog cites the example of the search query “insights how to manage a green.” This would be difficult for most people to understand, but neural matching can make better sense of this type of query. It can tell that the person who typed in this search query is seeking management tips based on color-based personality categories. This understanding helps Google find relevant content in its enormous and constantly evolving index of information.
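The underlying idea, sketched very loosely below, is that queries and pages are compared as vectors in a shared space rather than by exact keywords. The three-dimensional “embeddings” here are hand-made for illustration only; real systems learn high-dimensional vectors from data.

```python
import math

# Made-up toy embeddings; dimensions roughly: (personality, management, color).
EMBEDDINGS = {
    "green": (0.6, 0.1, 0.9),
    "manage": (0.2, 0.9, 0.0),
    "leadership": (0.3, 0.8, 0.1),
}

def cosine(a, b):
    """Cosine similarity between two vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# "manage" lands much closer to "leadership" than to "green", even though
# the strings share no keywords.
print(cosine(EMBEDDINGS["manage"], EMBEDDINGS["leadership"]))
print(cosine(EMBEDDINGS["manage"], EMBEDDINGS["green"]))
```

Matching in vector space is what lets a system connect a query’s wording to a concept it never mentions literally.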
Original Content Systems
Most people have encountered recycled information on the internet. Google has systems that help ensure that original content is displayed more prominently than content that simply cites this original content.
For example, it is not uncommon for lower-budget news websites to post rehashes of what more prominent outlets have published, as not all journalists will have the opportunity to interview newsmakers. Google aims to reward original reporting with higher search positions, given the effort involved in carrying out journalistic investigations and tracking down sources. These pages may remain in a prominent position in search results for longer, enabling searchers to see the original reporting alongside more recent articles about the story or topic.
There is a canonical markup that creators can use to help Google determine the primary page in instances where a page has been duplicated in several other places. For example, on sites that serve separate desktop and mobile versions of a page, Google would view the two as duplicates of the same page. One URL will be considered canonical by Google and crawled regularly, while the others will be deemed copies and crawled less often. Webmasters can tell Google which URL is canonical, giving them more control over which URL people see in search results and simplifying the tracking of metrics.
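The canonical hint itself is a single link element placed in the duplicate page’s head. The URL below is a placeholder, not a real site:

```html
<!-- On the mobile variant of a page, pointing Google at the primary URL. -->
<!-- example.com is a placeholder domain for illustration. -->
<link rel="canonical" href="https://www.example.com/products/widget" />
```

Google treats this as a strong hint rather than a command, weighing it alongside other signals when choosing which URL to show.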
Removal-Based Demotion Systems
Google has enacted policies that permit certain types of content to be removed from its search results. Once Google processes a large number of removals for a specific site, it reassesses that site’s position in the rankings.
There are two main types of removals: legal removals and personal information removals.
Legal Removals

When Google receives a significant number of copyright removal requests for a given site, it will demote other content from the same site within the search results. This means that if the site contains other infringing content that has not yet been reported, searchers will be less likely to find it. Google uses the same procedures to respond to complaints of counterfeit goods, defamation, and court-ordered removals.
Personal Information Removals
In cases where Google processes many personal information removals involving a site with exploitative removal practices, other content from the same site will be demoted within search results. If Google detects the same pattern of behavior on other sites, content on those sites will be subject to the same demotions. Sites with a high volume of doxxing removals may be similarly demoted. Google has also implemented automatic protections that stop nonconsensual explicit personal images from ranking highly in searches for a person’s name.
Page Experience System
Searchers vastly prefer to visit web pages that offer a great user experience. In response, Google has enacted a page experience system that looks at various criteria to assess how user-friendly a particular page is.
Core Web Vitals
Core web vitals signal whether a specific page offers a positive user experience, considering factors such as loading performance, visual stability, and interactivity. To make this assessment, Google looks at Largest Contentful Paint (LCP), which measures loading performance; Cumulative Layout Shift (CLS), which measures the site’s visual stability; and First Input Delay (FID), a measure of interactivity.
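CLS, for example, is not a simple sum of every layout shift: individual shift scores are grouped into “session windows” (shifts less than one second apart, with each window capped at five seconds), and the page’s CLS is the largest window’s total. The sketch below mimics that aggregation; the timestamps and scores are made-up sample data.

```python
# Rough sketch of CLS aggregation into session windows.
def cls(shifts, max_gap=1.0, max_window=5.0):
    """shifts: list of (timestamp_seconds, shift_score), sorted by time."""
    best = current = 0.0
    window_start = last = None
    for t, score in shifts:
        # Start a new window on the first shift, after a >1s gap,
        # or once the current window exceeds 5s.
        if (window_start is None or t - last > max_gap
                or t - window_start > max_window):
            window_start, current = t, 0.0
        current += score
        last = t
        best = max(best, current)
    return best

sample = [(0.1, 0.05), (0.4, 0.02), (3.0, 0.10)]  # 2.6s gap starts a new window
print(cls(sample))  # the larger window wins
```

Here the first two shifts form one window (0.07) and the late shift forms another (0.10), so the page’s CLS is 0.10.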
HTTPS

Google favors sites served over HTTPS, which encrypts the connection between a site and its visitors.
Lack Of Intrusive Interstitials
Preference is given to websites whose content can be accessed easily by users. This means they lack intrusive interstitials, page elements that can obstruct a reader’s view of content. These are often used for promotional purposes but can disrupt readers and obscure the words on the page, making sites hard to use and visitors unlikely to return.
Mobile-Friendliness

These days, more and more users visit websites on mobile devices, so Google prefers mobile-friendly pages. This signal does not apply to desktop searches, so in cases where a site has separate mobile and desktop URLs, Google bases the desktop signal on the URLs viewed by desktop users.
In cases where a search produces a high volume of potential matches of relatively equal relevance, preference will be given to the content that offers a better page experience.
Passage Ranking System
Google understands that it can be challenging to deliver accurate results for precise searches, given that the exact sentence that answers the question could be buried deep within a particular web page. Therefore, it has started using a passage ranking system to better interpret the relevancy of specific passages. Google likens this to finding a needle in a haystack and estimates that it improves roughly 7% of the search queries it receives across all languages.
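The core idea, sketched below with a crude term-overlap score, is to rank individual passages rather than only whole pages. Google’s actual system uses neural models; the page text and scoring here are illustrative only.

```python
# Toy passage ranking: split a page into passages and score each against
# the query independently, so a buried answer can still surface.
def best_passage(page_text, query):
    terms = set(query.lower().split())
    passages = [p.strip() for p in page_text.split("\n\n") if p.strip()]

    def score(passage):
        # Crude relevance: count of query terms appearing in the passage.
        return len(terms & set(passage.lower().split()))

    return max(passages, key=score)

page = (
    "Our store has served the neighborhood for forty years.\n\n"
    "You can check if a window has UV glass by holding a lighter near it "
    "and looking at the color of the reflections.\n\n"
    "Visit us on weekdays between nine and five."
)
print(best_passage(page, "how can I tell if my windows have uv glass"))
```

Scored as a whole, this page is mostly about a store; scored passage by passage, the one sentence that answers the question wins.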
Product Reviews System
From time to time, Google updates its automated ranking systems to ensure they reward high-quality product reviews containing detailed research, rather than thin summaries of a large range of products that fail to deliver the details searchers seek.
These regular product review updates refine Google’s ability to reward high-quality reviews with higher search positions. Google evaluates review content at the page level and looks for reviews written by enthusiasts or experts with deep knowledge of the topic that provide substantial insight and analysis. Although Google uses structured data to help identify product reviews, it does not rely on structured data alone.
RankBrain

When Google’s RankBrain AI tool launched in 2015, it marked the first time a deep learning system was used in Search. It represented a huge step forward in understanding how words relate to concepts. Although this is something humans can do easily, computers have historically struggled to make these connections. With RankBrain, Google was better able to understand how the words people use in search queries relate to concepts in the real world.
Google’s blog cites the example of a search for the phrase “what’s the title of the consumer at the highest level of a food chain.” When Google’s systems see these words across different pages, they learn that food chains could be related to animals rather than human consumers and can then match the words to related concepts to understand that the answer the individual is looking for is an “apex predator.”
RankBrain also helps Google better identify relevant content, even if it does not happen to contain every word the searcher uses. It uses the connections between terms and concepts to establish a relationship.
Reliable Information Systems
Google constantly uses multiple systems to surface the most relevant and dependable information. Its systems continuously strive to identify the most authoritative pages and promote high-quality journalism while demoting lower-quality content.
There may occasionally be searches for which it is challenging to return reliable information. In these cases, Google’s systems are set to show users content advisories when they do not have a high degree of confidence in the quality of the available results for that particular search or when the situation is changing quickly. They also advise users on search techniques that could produce better results.
Site Diversity System
Google has safeguards, known as its site diversity system, that generally prevent more than two web pages from a single site from appearing in the top results, stopping any one site from dominating the top of the page. Nevertheless, if its systems determine that multiple pages on the same site are particularly relevant for a search, they may occasionally show more than two listings.
Google’s site diversity system considers subdomains to be a part of a particular root domain for these purposes by default. Still, it may occasionally make an exception and treat subdomains as separate sites for diversity reasons.
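A diversity pass of this kind can be sketched as a simple cap on results per root domain. The URL list, two-result cap, and the crude last-two-labels heuristic for the root domain are all illustrative assumptions; the real system also decides when a subdomain deserves to count as a separate site.

```python
from collections import defaultdict
from urllib.parse import urlparse

def diversify(ranked_urls, per_site=2):
    """Keep ranked order, but cap how many results each root domain gets."""
    seen = defaultdict(int)
    kept = []
    for url in ranked_urls:
        host = urlparse(url).hostname or ""
        root = ".".join(host.split(".")[-2:])  # crude root-domain heuristic
        if seen[root] < per_site:
            kept.append(url)
            seen[root] += 1
    return kept

results = [
    "https://blog.example.com/a",
    "https://www.example.com/b",
    "https://shop.example.com/c",  # third example.com page: dropped
    "https://other.net/d",
]
print(diversify(results))
```

Note that the subdomains all collapse into `example.com` here, mirroring the default behavior described above.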
Spam Detection Systems
Just as many e-mail services provide spam filters, Google Search uses filters to keep spam from harming its ability to show users the most relevant and valuable results for a search query. Google uses spam detection systems such as SpamBrain to identify content and activity that violates its spam policies, and it regularly updates these systems to keep up with evolving spam tactics.
One of the most impressive solutions in Google’s arsenal in the fight against spam is SpamBrain. Thanks to this AI-based spam prevention system, Google caught 200 times the number of spam sites in 2021 as it did when the system first launched.
Launched in 2018, SpamBrain quickly made a name for itself by identifying many more spam sites than previous tools. Its introduction led to a 70% reduction in hacked spam and a 75% reduction in gibberish spam on hosting platforms.
Because spammers are becoming increasingly sophisticated in response to the efficacy of these tools, SpamBrain continues to improve its ability to identify malicious and disruptive behaviors. It has helped Google keep more than 99% of its searches free of spam.
Using Known Algorithm Information Within Your SEO Campaign
Understanding the key signals and factors within Google’s search ranking systems can help businesses improve their SEO campaigns. Google has provided a lot of helpful information that can guide decisions about how websites are built and the content they contain to rank higher in search results.
However, these factors are constantly evolving, and a deeper understanding of the interplay between so many different vital signals is needed to make the most of this information. The SEO professionals at 321 Web Marketing draw on their years of experience and insights to help businesses maximize their SEO efforts efficiently and effectively.
We have worked with countless businesses across various industries to improve their national and local SEO and ensure their sites offer visitors the best experience possible. To find out more about how our services can help your business stand out in search and reach more potential clients, contact the SEO team at 321 Web Marketing today.