You see, in other branches of digital marketing, like email marketing, social media, content marketing, or PPC, you know what you have to do to perform well.
In SEO, however, the ranking factors used by search engines – Google in particular – are not openly disclosed. And this is what makes it so cryptic – and more difficult than it should be.
The result? Many SEO professionals don’t always know which ranking factors they should consider and whether the factors they spend their time optimizing matter in the first place. So more often than they’d like to admit, they play the guessing game.
Here’s our take on the most discussed and controversial SEO ranking factors – fact🐴 or fiction🦄.
Anchor text is the visible, clickable text in a hyperlink. It can provide both search engines and users with relevant contextual information about the content of the link’s destination.
Anchor text provides contextual information about the linked page’s content. When search engines crawl a page, they analyze the anchor text to determine the relevance of the linked page to the anchor text and the page it’s linking from.
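To make this concrete, here is a rough sketch of the kind of anchor extraction a crawler performs, using Python’s standard-library HTML parser (the HTML snippet and class name are made up for illustration):

```python
from html.parser import HTMLParser

class AnchorTextParser(HTMLParser):
    """Collects (href, anchor text) pairs, the way a crawler might."""
    def __init__(self):
        super().__init__()
        self.links = []    # list of (href, text) tuples
        self._href = None  # href of the <a> we are currently inside
        self._text = []    # text fragments collected inside that <a>

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

html = ('<p>Read our <a href="/seo-guide">complete SEO guide</a> '
        'or <a href="/contact">click here</a>.</p>')
parser = AnchorTextParser()
parser.feed(html)
print(parser.links)
# The first anchor ("complete SEO guide") is descriptive;
# the second ("click here") gives search engines no context at all.
```

Notice how the second anchor tells a crawler nothing about the destination page – exactly the kind of generic anchor text the guidelines below advise against.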
Relevant anchor text helps search engines understand the topic and theme of the linked page, which can positively impact its ranking for related search queries.
Anchor text serves as a navigational cue for users, indicating what they can expect when they click on a link. Clear and descriptive anchor text helps users understand where they will be directed, improving their overall browsing experience.
When users find value in the linked page, they are more likely to spend time on the site, reducing bounce rates and increasing engagement metrics, which can indirectly impact SEO rankings.
Search engines look for a natural link profile, which includes a diverse range of anchor text variations. A healthy mix of anchor text types, such as branded, generic, keyword-rich, and partial-match anchors, suggests that the links to a page are acquired naturally and not through manipulative tactics.
Having a diverse anchor text profile helps build trust with search engines and reduces the risk of being penalized for over-optimization.
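As a quick illustration of what an anchor text profile looks like, here’s a sketch that tallies the distribution of anchors pointing to a page (the anchor texts and brand name are hypothetical):

```python
from collections import Counter

# Hypothetical anchor texts of backlinks pointing to one page:
# a mix of branded, keyword-rich, generic, and naked-URL anchors.
anchors = [
    "Acme Tools",              # branded
    "Acme Tools",              # branded
    "best cordless drill",     # keyword-rich
    "click here",              # generic
    "cordless drill reviews",  # partial match
    "https://acme.example",    # naked URL
]

profile = Counter(anchors)
total = sum(profile.values())
for text, count in profile.most_common():
    print(f"{text!r}: {100 * count / total:.0f}%")
```

A profile where a single exact-match keyword anchor dominated instead would be the kind of pattern that looks manipulative to search engines.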
Search engines have evolved to prioritize content quality, relevance, and user signals such as click-through rates, dwell time, and engagement metrics. The focus has shifted from relying solely on external signals like anchor text to understanding the actual content on the page and the user experience.
While anchor text can still provide contextual information, its impact on rankings may be less significant compared to other factors.
Search engine algorithms have become more sophisticated and now employ advanced natural language processing techniques to better understand the content and context of web pages.
They can analyze the overall topical relevance of a page, assess the semantic meaning of words and phrases, and identify relationships between different pieces of content without relying heavily on anchor text alone.
Historically, anchor text was often manipulated through keyword stuffing or unnatural linking schemes to artificially boost rankings. Search engines have become more adept at detecting such manipulative tactics and have implemented penalties to discourage these practices.
This has led to a more cautious approach towards anchor text optimization, as over-optimization or excessive use of keyword-rich anchor text can now hurt rankings.
Anchor text is indeed important because it helps search engines understand what a webpage is about. Using descriptive anchor text when inserting links on a page is a best practice listed in Google’s SEO Starter Guide. According to these guidelines, “With appropriate anchor text, users and search engines can easily understand what the linked pages contain.”
Bounce rate is a metric used in website analytics to measure the percentage of visitors who leave a website after viewing only one page, without interacting or exploring further. In other words, it represents the rate at which visitors “bounce” away from a website.
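The math behind the metric is simple – single-page sessions divided by total sessions. Here’s a minimal Python sketch (the session data is made up):

```python
def bounce_rate(sessions):
    """Bounce rate = single-page sessions / total sessions, as a percentage.

    `sessions` is a list where each entry is the number of pages
    viewed in one session (hypothetical analytics data).
    """
    if not sessions:
        return 0.0
    bounces = sum(1 for pages in sessions if pages == 1)
    return 100.0 * bounces / len(sessions)

# Ten sessions: four visitors left after viewing a single page.
sessions = [1, 3, 1, 2, 5, 1, 1, 4, 2, 3]
print(f"Bounce rate: {bounce_rate(sessions):.1f}%")  # Bounce rate: 40.0%
```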
Search engines aim to deliver the most relevant and valuable results to users. High bounce rates may indicate that a webpage fails to meet user expectations or provide the desired information.
If a significant number of users quickly leave a page without engaging further, it suggests that the content, design, or user experience may be subpar. To ensure user satisfaction, search engines are likely to consider bounce rate as a signal of relevance and quality.
Bounce rate is closely related to dwell time, which is the amount of time a user spends on a webpage before returning to the Search Engine Results Pages (SERPs). Longer dwell times generally indicate higher engagement and suggest that users find the content valuable and relevant.
Search engines may interpret longer dwell times as a positive signal, indicating that the page provides useful information and satisfies user intent. Consequently, websites with lower bounce rates are more likely to have longer dwell times and, potentially, higher rankings.
Analyzing bounce rates can help search engines fine-tune their algorithms and improve the accuracy of their search results. If a particular page consistently has a high bounce rate for a specific query, it suggests that the page is not meeting user expectations or addressing their needs effectively.
By incorporating bounce rate as a factor, search engines can refine their understanding of user intent and deliver more precise results, thereby enhancing the overall search experience.
Bounce rate is a metric that measures the percentage of users who leave a website after viewing only one page. While a high bounce rate may indicate that users did not engage further with the site, it doesn’t necessarily provide insight into the reasons behind the bounce. Users may have found the information they were seeking, contacted the business through other means, or completed a desired action without navigating to another page.
Consequently, bounce rate alone does not establish a direct causal relationship between user behavior and website quality.
Users visit websites with diverse intentions, and their behavior can vary depending on the context. For example, if a user performs a specific search and lands on a page that provides an immediate answer to their query, they may leave the site without further exploration, leading to a high bounce rate.
However, this doesn’t necessarily indicate a negative user experience or poor content quality. Bounce rate does not take into account the relevance of the page to the user’s search query or their overall satisfaction with the content consumed.
Bounce rate measurements can be influenced by various factors that may not be indicative of the website’s quality. Technical issues, slow loading times, misleading search snippets, or external factors beyond the website owner’s control can impact the bounce rate. Additionally, the interpretation of bounce rate can be subjective.
For instance, some websites, such as blogs or news sites, may have a higher bounce rate due to users consuming a single article before leaving. However, this doesn’t necessarily indicate a negative experience or lack of engagement.
On the one hand, Google has repeatedly said that bounce rate does not directly influence page rankings. For example, as stated a few years ago: “I think there’s a bit of misconception here that we’re looking at things like the analytics bounce rate when it comes to ranking websites, and that’s definitely not the case.”
However, Google’s API leaks that happened in May have shed more light on this ranking factor. Specifically, the leaks highlighted that Google uses a variety of user engagement metrics, including bounce rate, to assess the quality of web pages. For instance, factors like “goodClicks,” “badClicks,” and “dwell time” (how long a user stays on a page) are considered.
This indicates that Google does take into account how users interact with search results and websites to some extent.
Moreover, the documents suggest that poor user engagement, such as high bounce rates, can lead to demotions in search rankings. This aligns with the idea that Google aims to promote content that keeps users engaged and satisfied.
CTR in SERPs refers to the percentage of users who click on a specific search result out of the total number of users who view that result.
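In other words, CTR is clicks divided by impressions. A quick Python sketch with made-up numbers:

```python
def ctr(clicks, impressions):
    """Click-through rate: clicks / impressions, as a percentage."""
    if impressions == 0:
        return 0.0
    return 100.0 * clicks / impressions

# A result shown 2,000 times in the SERPs and clicked 90 times:
print(f"CTR: {ctr(90, 2000):.1f}%")  # CTR: 4.5%
```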
Search engines aim to provide the most relevant and valuable results to users. Click-through rate can be seen as a user behavior signal that reflects how appealing and relevant a search result is.
If a particular search result consistently receives a high CTR compared to others, it suggests that users find it compelling and relevant. Search engines may interpret this as a positive signal and potentially adjust rankings to prioritize pages that generate higher CTR.
CTR data provides valuable insights into user preferences and satisfaction. Search engines can analyze patterns in CTR to better understand which search results are most appealing to users for specific queries.
By incorporating CTR as a factor, search engines can refine their algorithms and improve the accuracy of their search results, aligning them more closely with user intent. This iterative process helps search engines continuously deliver more relevant and engaging results.
High CTR can be an indicator of brand authority and trustworthiness. Well-established brands often enjoy higher CTR as users recognize and trust their reputation.
When a brand consistently generates high CTR for relevant queries, it suggests that users trust the brand and perceive its content as valuable. Search engines may consider this as a signal of authority and potentially reward the brand with improved rankings.
CTR does not guarantee the quality or relevance of a webpage. A high CTR can be influenced by various factors such as compelling titles, attractive snippets, or positioning in search results.
However, these elements do not necessarily reflect the actual content’s value or alignment with user intent. Search engines primarily aim to deliver high-quality and relevant results, which are determined by other factors such as content relevance, authority, and user engagement.
CTR can be affected by external factors beyond a website owner’s control. For instance, user preferences, search intent, and the positioning of search results can all impact CTR.
Additionally, different search engines or platforms may measure CTR differently, making it difficult to establish a consistent and reliable metric for ranking purposes. Relying solely on CTR as a ranking factor would lead to inconsistencies and potential manipulation.
If CTR were a direct ranking factor, it would be susceptible to manipulation through unethical practices such as click fraud or artificially boosting CTR through paid methods.
This could undermine the integrity of search results and lead to a poor user experience. Search engines prioritize factors that are more difficult to manipulate, such as content quality, relevance, user engagement signals, and authority, to ensure fair and accurate rankings.
According to Google’s Gary Illyes, CTR is not one of Google’s ranking factors. As he stated a few years ago, “If you think about it, clicks in general are incredibly noisy. People do weird things on the search result pages. They click around like crazy, and in general, it’s really, really hard to clean up that data.”
However, just as with the bounce rate as a factor, Google’s API leaks indicate otherwise. The leaked documents revealed that user interaction metrics, including CTR, play a significant role in how Google ranks pages. Specifically, metrics like “goodClicks” and “badClicks” are used to assess the quality and relevance of search results.
These documents suggest that pages with higher CTRs, indicating they attract more clicks from users, are likely to be favored in rankings. This aligns with Google’s broader emphasis on user engagement and experience as crucial elements in their ranking algorithm.
The content length is pretty self-explanatory. There are two questions regarding content length:
a. does short-length content rank, and
b. how much is too much?
Longer content has the potential to provide more comprehensive coverage of a topic. Search engines strive to deliver high-quality and relevant results to users. In many cases, longer content has the ability to address user intent more thoroughly by including in-depth information, addressing related subtopics, and providing a comprehensive resource for users.
This can lead to increased user satisfaction and engagement, which are positive signals for search engines.
Longer content offers more opportunities for incorporating relevant keywords and related terms naturally. Search engines analyze the presence and frequency of keywords within content to determine their relevance to specific search queries.
With longer content, there is a higher likelihood of naturally including relevant keywords and providing context around them. This can improve the content’s visibility for target keywords and enhance its chances of ranking higher in search results.
Longer content often has a higher potential to attract backlinks from other websites. When content is comprehensive, informative, and valuable, it tends to be cited and linked to by other online sources. Backlinks are important signals of authority and trust to search engines.
By providing long-form content that offers unique insights or comprehensive resources, there is a higher likelihood of attracting backlinks from reputable websites, which can positively impact SEO rankings.
Search engines prioritize delivering relevant and high-quality content to users. While longer content may offer the opportunity to include more information, it does not guarantee its relevance or quality.
Search engines primarily assess the relevance of content based on factors like keyword usage, semantic analysis, and user signals. A shorter, concise piece of content that directly addresses user intent and provides valuable information can often outrank longer but less relevant or poorly written content.
The focus is on providing the best user experience and fulfilling user intent. Content that effectively meets user needs, engages readers, and encourages them to take desired actions tends to rank well.
The length of the content itself is not a direct indicator of user engagement or satisfaction.
Different industries, topics, and user intents require varying lengths of content. For certain queries or topics, shorter content may be preferred by users who seek quick answers or information. In contrast, more complex or comprehensive subjects may necessitate longer-form content.
Search engines take into account the context and user preferences for specific topics, rather than enforcing a rigid rule for content length across the board.
There’s no real benefit to extending the length of content to fit an arbitrary word count.
There are more important content-related factors to look out for, including a solid content structure, quality of information, visual support, and imagery.
As Google’s John Mueller replied on reddit.com/r/bigseo/ to the question “How to find word count in SERP?”: “Word count is not a ranking factor. Save yourself the trouble.”
Older domains tend to have more established histories, which can contribute to building trust and authority with search engines.
A website that has been around for a longer period is more likely to have accumulated quality backlinks, developed a solid content base, and gained a loyal user base. This longevity can signal reliability to search engines, leading to higher rankings.
Backlinks play a crucial role in SEO, as they are considered as votes of confidence and credibility from other websites. Over time, older domains tend to accumulate a greater number of backlinks naturally.
These backlinks can help enhance the website’s authority and improve its search engine rankings. While it’s not solely dependent on domain age, older domains generally have more opportunities to acquire high-quality backlinks.
The age of a domain often coincides with the amount of content that has been published over time. Older domains tend to have a larger number of indexed pages and a more extensive content library.
Search engines value websites with a robust content base as they offer more value to users. By consistently adding new, high-quality content, older domains can demonstrate their expertise and relevancy, ultimately boosting their SEO rankings.
Search engines primarily aim to provide the most relevant and useful results to users. Domain age, on its own, does not guarantee relevance or quality content. A newer domain may have fresh and updated content that is more relevant to a specific search query compared to an older domain with outdated or less useful information.
Search engines focus more on content quality, relevance, and user experience rather than solely relying on the age of the domain.
Search engine algorithms constantly evolve to provide better search results. Over time, search engines have become more sophisticated, relying on numerous factors to determine rankings.
While domain age may have held more weight in the past, search engines now consider a wide range of factors, such as content quality, user engagement metrics, and backlinks to assess the relevance and authority of a website. These factors often hold more importance than domain age alone.
Sometimes, website owners acquire existing domains with established age to launch new projects or businesses. In such cases, the domain age does not reflect the actual history or credibility of the current website.
Search engines understand this and do not solely rely on domain age but instead analyze the overall quality, relevance, and user satisfaction signals of the website’s current iteration.
Historically, Google has consistently downplayed the significance of domain age in its ranking algorithms. The official stance has been that the age of a domain is not a ranking factor, emphasizing instead the quality and relevance of the content.
Yet again, Google’s API leaks prove otherwise. The leaked information reveals that Google indeed considers domain age through a parameter known as “hostAge.” This parameter is used to sandbox new or untrusted websites.
To conclude: The impact of domain age on a site’s ranking primarily affects newer domains and not established ones.
Dwell time is the amount of time a user spends on a webpage after clicking on a search result before returning to the search results page. It is a metric that measures the engagement and satisfaction of users with the content they find.
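Put simply, dwell time is the gap between two events: clicking a result and returning to the SERP. A minimal sketch with hypothetical timestamps:

```python
from datetime import datetime

def dwell_time_seconds(click_time, return_time):
    """Dwell time: seconds between clicking a search result
    and returning to the results page (hypothetical timestamps)."""
    return (return_time - click_time).total_seconds()

clicked = datetime(2024, 5, 1, 10, 0, 0)    # user clicks a result
returned = datetime(2024, 5, 1, 10, 2, 30)  # user returns to the SERP
print(dwell_time_seconds(clicked, returned))  # 150.0
```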
Dwell time is considered by some as an indicator of user engagement and satisfaction. If users spend a significant amount of time on a page, it suggests that the content is relevant and meets their needs.
The mission of search engines, like Google, is to deliver the most relevant and useful content to users, so considering dwell time can be seen as a way to gauge content quality.
Websites that keep users engaged and provide a positive experience tend to have longer dwell times. If a website offers valuable and engaging content, users are more likely to stay on the page, read the content more thoroughly, and potentially explore other pages as well.
Considering dwell time as a ranking factor can encourage website owners to focus on improving user experience, resulting in better overall satisfaction and longer session durations.
Dwell time can help differentiate high-quality content from low-quality content. If users quickly bounce back to the search results page after clicking on a link, it could indicate that the content did not meet their expectations or provide the desired information.
By considering dwell time, search engines can assess which pages are more likely to satisfy user intent and deliver a positive experience, thus encouraging website owners to produce high-quality and engaging content.
While search engines can track certain user behavior metrics like click-through rates, they may have limited access to data on how long a user stays on a page. Dwell time can be challenging to measure accurately, as search engines cannot directly track user activity once they leave their search results page.
Dwell time can be influenced by various external factors that are unrelated to the quality or relevance of the content. Factors like slow internet connection, distractions, or multitasking can affect how long users stay on a page, regardless of the content’s quality.
Using dwell time as a ranking factor may inadvertently penalize websites for factors beyond their control.
Dwell time alone may not provide sufficient context to evaluate content quality. Users may spend a long time on a page because they are struggling to find information or because the content is difficult to understand.
Conversely, users may find a quick answer and leave satisfied. Dwell time alone does not reveal the true value or relevance of the content to the user’s query.
Pages with higher dwell times are generally seen as providing more value to users and are likely to rank higher in search results. The latest Google API leaks included various user engagement signals, such as “goodClicks” and “lastLongestClicks”.
Overall, Google’s API leaks affirm that providing a positive user experience by creating valuable, relevant, and engaging content is essential for achieving better rankings.
Domains from the government and education space are thought to be more trustworthy. Therefore, when a government agency or an educational institution links back to your domain, the link has a bigger impact than links from other domains. Or does it?
.gov and .edu domains are typically associated with government and educational institutions, respectively. These domains are considered authoritative and trustworthy sources of information.
When these domains link to other websites, search engines interpret it as a vote of confidence from respected entities. Such backlinks can significantly enhance the credibility and trustworthiness of a website, which can positively impact its SEO rankings.
Government and educational institutions are known for producing high-quality and reliable content. Search engines strive to deliver the most relevant and useful information to users, and links from .gov and .edu domains indicate that the linked website contains valuable content.
These domains often have strict editorial guidelines and standards, ensuring that the information provided is accurate, reliable, and backed by authoritative sources. Search engines recognize the value of linking to such reputable sources, which can result in improved SEO rankings.
Acquiring backlinks from .gov and .edu domains can be challenging since these domains typically have stringent linking policies. Consequently, when a website manages to secure links from such domains, it indicates that the content is genuinely noteworthy and valuable.
The difficulty in obtaining these links suggests to search engines that the website has earned its backlinks through merit rather than manipulative tactics. As a result, search engines may reward the website with higher rankings due to the perceived quality and authenticity of the acquired links.
Search engines prioritize relevance and context when evaluating the quality and authority of a link. While .gov and .edu domains are generally associated with reputable institutions, the relevance of a link is crucial.
If a .gov or .edu link is not contextually relevant to the content it is linking to, search engines may not assign much weight to it in terms of SEO rankings. The focus is on the quality and relevance of the content itself, rather than solely relying on the domain extension.
While .gov and .edu domains are typically considered trustworthy, they are not immune to spam or manipulation. There have been instances of link schemes and black hat SEO practices targeting these domains to artificially boost rankings.
Search engines are aware of these practices and have implemented algorithms to detect and penalize such manipulative tactics. Therefore, simply acquiring .gov or .edu links without genuine relevance or value may not significantly impact SEO rankings.
Search engines prioritize a diverse and natural backlink profile that indicates a website’s authority and popularity. While .gov and .edu links can be valuable, relying solely on them for link building may result in an imbalanced backlink profile.
A healthy link profile should include links from various high-quality domains across different industries and sectors. Diversifying the backlink profile helps demonstrate a website’s credibility and relevance to a broader audience, rather than solely relying on a few specific domain extensions.
A backlink is a backlink and backlinks are confirmed ranking factors. While .gov and .edu links can have some positive impact on SEO rankings, their significance should be viewed in the context of relevance, diversity, and overall quality of a website’s backlink profile. In other words, they don’t offer more value than other types of links.
Building a comprehensive and diverse backlink profile with a focus on high-quality content remains essential for successful SEO strategies.
Page speed is widely recognized as an important factor for SEO rankings. But is it really a ranking factor or is it just a myth?
Page speed directly impacts user experience and satisfaction. Users expect fast-loading webpages and tend to abandon sites that take too long to load. Slow-loading pages can lead to high bounce rates and low engagement, negatively impacting user experience.
Search engines prioritize delivering the best user experience and are more likely to rank websites that provide faster and more responsive pages.
With the significant increase in mobile internet usage, mobile optimization has become crucial for SEO. Mobile devices often have limited resources and slower internet connections compared to desktops.
Therefore, fast-loading pages are even more critical for mobile users. Search engines prioritize mobile-friendly websites and consider page speed as an important factor in determining mobile search rankings. Websites that offer better mobile experiences by optimizing their page speed are more likely to rank higher in mobile search results.
Search engine bots have limited resources and time to crawl and index web pages. Faster-loading pages allow search engine crawlers to efficiently crawl and index a larger number of pages within a given timeframe.
When search engines can access and analyze a website’s content quickly, it increases the chances of the website’s pages being indexed and ranked more effectively. Therefore, optimizing page speed can enhance the visibility and discoverability of a website’s content in search engine results.
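If you want a rough sense of your own page speed, you can time a fetch with nothing but the Python standard library. This measures only the network fetch, not rendering – a very crude proxy for what tools like PageSpeed Insights report in far more detail (the URL below is a placeholder):

```python
import time
import urllib.request

def measure_load_time(url):
    """Rough server response + download time for a single URL, in seconds.

    Note: this ignores rendering, JavaScript execution, and asset
    loading, so treat it as a ballpark figure only.
    """
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as response:
        response.read()
    return time.perf_counter() - start

# Example usage (hypothetical URL, requires network access):
# print(f"{measure_load_time('https://example.com'):.2f}s")
```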
Search engines primarily focus on delivering relevant and high-quality content to users. While page speed can impact user experience, it does not directly correlate with the relevance or quality of the content itself.
Page speed, although important for user experience, does not inherently indicate the quality or relevance of the content.
Internet speeds vary across different regions and mobile networks. While page speed may be affected by factors such as server response time and page optimization, it is also influenced by external factors beyond the website owner’s control.
Users accessing a website from different locations or using different mobile networks may experience varying page speeds. Penalizing websites based on page speed may unfairly disadvantage websites in areas with slower internet connections, even if the website itself is well-optimized.
Search engines prioritize user satisfaction and engagement metrics when determining search rankings. While page speed can contribute to a positive user experience, it is not the sole factor that determines user satisfaction.
Other factors such as content relevance, ease of navigation, mobile-friendliness, and overall user engagement play significant roles in determining how satisfied users are with a website. Search engines consider a wide range of user satisfaction metrics rather than relying solely on page speed.
Page speed as a Google ranking factor aligns with Google’s commitment to delivering the best user experience.
Faster-loading pages enhance user satisfaction and crawl efficiency, and provide a competitive advantage. By prioritizing page speed optimization, website owners can indeed improve their chances of achieving higher search engine rankings and attracting more organic traffic.
Social shares refer to the act of users sharing content from a website or webpage on social media platforms such as Facebook, Twitter, LinkedIn, or Pinterest.
When users find content valuable or interesting, they may choose to share it with their social network by posting a link or sharing the content directly. Social shares indicate the popularity and engagement of a piece of content within the social media sphere.
Social shares can amplify the reach and visibility of content. When content is shared on social media, it has the potential to be seen by a larger audience beyond the website’s existing visitors.
More visibility can lead to increased traffic to the website, which can positively impact SEO rankings.
Social shares can help build brand authority and awareness. When content is widely shared and recognized on social media, it can enhance the brand’s visibility and reputation.
Increased brand awareness can lead to more branded searches and direct traffic to the website, both of which are positive signals for search engines and can contribute to improved SEO rankings.
The number of social shares alone does not necessarily indicate the quality or relevance of the content.
While social shares can indicate popularity, the actual value of the content should be assessed based on other factors such as user engagement metrics, content depth, and relevance to the search query.
Social shares can be manipulated through various means, such as paid social media promotion or artificial engagement tactics. This can lead to inflated social share counts that do not necessarily reflect genuine user interest or content value.
Search engines aim to provide accurate and reliable search results, and relying heavily on social shares as a ranking factor may introduce biases and inaccuracies.
Social media platforms have varying degrees of accessibility and impact across different industries and demographics. While social media is widely popular, not all industries or target audiences heavily rely on social media for content discovery and engagement.
Placing excessive emphasis on social shares as an SEO ranking factor may overlook the relevance and impact of a website or content in specific niches or markets.
While social signals may indirectly impact SEO, such as by generating backlinks or increasing visibility, Google does not use the number of social shares as a direct ranking signal in its algorithm.
What’s more, there is no significant new information from the leaks regarding social shares directly affecting rankings.
Google RankBrain is an artificial intelligence (AI) system developed by Google to enhance the search engine’s ability to understand and interpret user queries. Effectively, it’s a machine learning-based algorithm that helps Google process and deliver more relevant search results.
RankBrain utilizes machine learning to understand and interpret user search queries. It aims to provide more accurate and relevant search results by understanding the underlying intent behind a search query rather than relying solely on keyword matching.
RankBrain analyzes various factors, including user behavior, click-through rates, and dwell time, to assess the relevance and quality of search results. By considering user intent, RankBrain helps search engines deliver more precise results, improving the overall search experience.
RankBrain helps search engines better understand the meaning and context of search queries. It can identify relationships between words, concepts, and entities to provide more comprehensive search results.
This capability allows RankBrain to interpret complex and ambiguous search queries and provide relevant results even when the exact keyword match is not present.
RankBrain is designed to adapt and learn from new data and user interactions over time. It continually analyzes user behavior and adjusts its understanding of search queries and ranking factors. This adaptability allows RankBrain to adapt to changing search patterns, new search terms, and emerging trends.
The dynamic nature of RankBrain contributes to the ongoing improvement of search engine rankings.
Google has not provided explicit details about how much weight RankBrain carries in the overall ranking algorithm. As a result, some SEO professionals and website owners may question its impact and consider it less influential compared to other known ranking factors. The lack of transparency makes it difficult to gauge the precise importance of RankBrain in relation to other factors.
That’s it; that’s the only reason someone can argue against RankBrain being an SEO factor.
Google RankBrain is a ranking factor because it helps Google better understand user search queries and their intent, allowing for more accurate and relevant search results.
By analyzing user behavior signals and learning from patterns, RankBrain continuously adapts and improves its understanding of complex and ambiguous queries, ensuring that search results align with user expectations.
Many of these ranking factors were considered fiction in the recent past. However, since May’s Google API leaks, the tables have turned: what was once considered an insignificant factor now is significant.
The TL;DR of Google’s SEO ranking factors looks like this:
Do you need help ranking higher or with other parts of your SEO strategy?
Contact us and let’s see how we can help increase your organic traffic!
I write for GrowthRocks, one of the top growth hacking agencies. For some mysterious reason, I write on the internet yet I’m not a vegan, I don’t do yoga and I don’t drink smoothies.