21 Truths and Myths About Ranking Websites in 2020

Since Google entered the search engine field in 1998, it has dominated Internet search. And because Google is the de facto leader in search, it’s only prudent that SEO professionals pay attention to the search engine giant’s changes in how it ranks websites.

To determine which ranking factors matter going forward, we need to separate ranking truths from myths. Here are 21 truths and myths about ranking websites in 2020.

1. Domain Age and Registration Length

The myth.
Domain age has been considered a ranking factor for many years. It is believed that older, established domains will rank better than newer ones. It is also believed that domain names with longer registration periods (2 or 3 years) look more legitimate in the eyes of Google.

The truth.
For many years Google has denied that domain age has any bearing on ranking. In 2010, Matt Cutts of Google clearly stated that there was no difference between a domain that is 6 months old and one that is 6 years old. Google’s John Mueller reaffirmed Cutts’ statement and added that registration length is not a ranking factor either.

My take.
I have heard these ranking myths for a long time. It is true that a new domain will be harder to rank for a while, but not because Google places value on older domains. It is because new domains don’t have any Domain Authority (DA), and that takes time to build.

2. Keywords in the Domain

The myth.
For a long time we were told that having keywords in the domain would enable a site to rank better and faster. The reasoning was that the domain keywords offered a relevancy signal to Google.

The truth.
John Mueller dispelled this myth by saying: “… just because keywords are in a domain name doesn’t mean that it’ll automatically rank for those keywords. And that’s something that’s been the case for a really, really long time.”

My take.
If the SEO is done correctly on a site, including well-written content, it will naturally rank for any keywords that happen to be in the domain. People were mistaking correlation for causation, and it was accepted as fact.

3. Keywords in a Page’s URL

The myth.
Using a keyword in a page’s URL was considered a ranking signal. And these URLs may serve as their own anchor text when shared.

The truth.
John Mueller has said that having a keyword in a page’s URL is a ranking signal, but a small one. Mueller went on to say, “I believe that is a very small ranking factor. So, it is not something I’d really try to force. And it is not something I’d say is even worth your effort to restructure your site just so you can get keywords in your URL.”

My take.
A URL containing keywords may be only a negligible ranking factor, but it can be a factor in CTR (Click Through Rate). We must never think only about ranking, but also about how users will react to the words we use, not only in URLs but in all of our content.

4. Keywords in the Title Tag

The myth.
The title tag tells visitors and search engines what a web page is about. The contents of this tag act as a ranking factor as long as the title relates to the underlying content.

The truth.
In 2016 John Mueller said, “Titles are important! They are important for SEO. They are used as a ranking factor. Of course, they are definitely used as a ranking factor, but it is not something where I’d say the time you spend on tweaking the title is really the best use of your time.”

My take.
SEO professionals should still pay attention to keyword-rich titles because they are a ranking factor, even if a less important one than they used to be. We just need to make sure these titles are relevant to the page.
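For reference, a relevant, keyword-rich title tag sits in the page’s HTML head and might look something like this (using this post’s own title as the example):

<head>
  <title>21 Truths and Myths About Ranking Websites in 2020 | Blue Lacy SEO</title>
</head>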

5. Keywords in the Description Tag

The myth.
The description tag is a short description of what the web page is about. It is believed that including keywords in it is important for ranking and for enticing viewers to click through to that page.

The truth.
In 2009 Google stopped using the description tag in its ranking algorithm, along with the separate keyword tag. Google says that it sometimes uses the description tag for the snippet shown in search results, but it does not use it for ranking a page.

My take.
The description tag is still important for SEO even though it is not a ranking factor for Google. A well-written description is valuable for improving CTR.
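As a hypothetical illustration, a description tag written to earn the click rather than to stuff keywords might look something like this (the wording is only an example):

<meta name="description" content="We separate 21 SEO ranking truths from myths using Google’s own statements, so you can stop chasing factors that don’t matter.">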

6. Keywords in the H Tag

The myth.
H tags, or heading tags, are a series of headings labeled H1 to H6 in the HTML of a web page. The most important heading is labeled H1, while the least important is H6. These tags help search engines grasp a page’s structure. Putting keywords in these tags is believed to improve the ranking of that page.

The truth.
In 2015 John Mueller addressed this issue by saying in part, “…we do use it to understand the context of a page better, to understand the structure of the text on a page, but it’s not the case like you’d automatically rank 1 or 3 steps higher just by using a heading.”

My take.
While not a ranking factor, using keywords in heading tags is still a good idea because it gives search engines a better idea of the structure of the content on a page, and it helps readers know what to expect in the paragraphs below each heading.
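Here is a rough sketch of how heading tags outline a page’s structure, using this post’s own outline as the example:

<h1>21 Truths and Myths About Ranking Websites in 2020</h1>
  <h2>1. Domain Age and Registration Length</h2>
  <h2>2. Keywords in the Domain</h2>
  <h2>3. Keywords in a Page’s URL</h2>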

7. Image Alt Tags

The myth.
Alt tags are the alternative text tags for images. Often overlooked by web creators, these tags are important for describing what an image is about. Alt tags matter to search engines because they associate images with a web page’s content and make those images more likely to appear in image search results.

The truth.
Google has stated that Alt tags are extremely helpful for ranking in Google Images, but they seem to have little value in the overall ranking of a given web page.

My take.
Even though Alt tags don’t appear to be a direct ranking signal for web pages in general, they are still important for Google Image search and should be included for every image on your site.
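A simple example of an image with a descriptive Alt tag (the file name and description are just placeholders):

<img src="leather-boots.jpg" alt="Pair of handmade brown leather boots on a wooden workbench">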

8. TF-IDF

The myth.
TF-IDF is short for Term Frequency-Inverse Document Frequency. This fancy term measures how important a keyword phrase is to a document relative to how often that phrase appears across a large set of documents. Many top SEO firms use this formula to determine the impact of keyword phrases on ranking. It is believed that Google uses this method to evaluate the relevance of a web page’s content.

The truth.
Again we return to the expert, Google. They admit that TF-IDF has been used for a long time, but now other methods are in place. John Mueller puts it in context: “With regards to trying to understand which are the relevant words on a page, we use a ton of different techniques from information retrieval. And there’s tons of these metrics that have come out over the years.”

My take.
Statements from Google affirm that TF-IDF is an outdated method of keyword retrieval and confirm that Google is moving towards other methods that rely less on raw keywords and more on the relevance of the words on a page. Surprisingly, well-regarded tools such as SEMrush still tout the use of TF-IDF for identifying content gaps on web pages. While TF-IDF may still have some use in diagnosing ranking problems, time is better spent creating content with users in mind, which is what Google is now looking for.
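For context, the classic textbook TF-IDF formula multiplies how often a term appears in one document by how rare that term is across the whole collection:

\[
\text{tf-idf}(t, d) = \text{tf}(t, d) \times \log\frac{N}{\text{df}(t)}
\]

Here tf(t, d) is the number of times term t appears in document d, N is the total number of documents in the set, and df(t) is the number of documents that contain t. A term only scores high when it is frequent on the page but rare everywhere else, which is why SEO tools treat high-scoring terms as topically important.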

9. Content Length

The myth.
According to extensive studies, there is a correlation between content length and ranking, with long content out-ranking shorter content. Content between 1,600 and 2,000 words tends to rank higher and attract more traffic than shorter content. SEO guru Brian Dean came up with 1,890 words as the sweet spot for content length.

The truth.
Here’s what John Mueller says about ranking and word count: “Word count is not a ranking factor. Save yourself the trouble.”

My take.
The problem with these “extensive studies” by well-regarded SEO specialists is the assumptions built into their collection methods. They are not scientific and have not been sufficiently scrutinized or peer reviewed. As a result, SEO professionals end up chasing rainbows in an attempt to increase rankings.

The reason longer content ranks higher is that it covers a topic in more depth, provides more value, and answers the user’s questions more effectively than short content.

10. Fresh Content

The myth.
Google will rank sites higher when they are updated with new or fresh content.

The truth.
Google’s update in 2009, called the “Caffeine Update”, was essentially a new system for crawling the web and indexing sites. It was designed to provide fresher results, especially for frequently updated content, such as news, politics, and trends. Google has denied giving a ranking boost to those sites with frequently updated content.

My take.
Even though Google’s John Mueller dismisses the idea that newer content equals better rankings, there is a part of Google’s ranking algorithm called “Query Deserves Freshness,” or QDF. It essentially means that some search requests deserve up-to-date search results. When Google sees an update on a current topic, it tends to rank that updated content higher in the SERPs (Search Engine Results Pages). That doesn’t mean all “fresh” content will rank better, only content on topics Google deems “hot.” In other words, it has to be new content about something currently popular on the web.

11. LSI Keywords

The myth.
LSI stands for Latent Semantic Indexing. It is an information retrieval method that finds hidden relationships between words, which search engines supposedly use to understand the deeper meaning of the content on a web page. It is believed that these semantically related words impact ranking for a given page.

The truth.
Here’s what John Mueller says about LSI: “There’s no such thing as LSI keywords — anyone who’s telling you otherwise is mistaken, sorry.” It seems that Google uses much more modern ways of determining quality of content on a web page.

My take.
I don’t know where the idea of LSI keywords originated, but LSI itself is an old method of information retrieval. Yet many SEO pros, such as Brian Dean from Backlinko, still write posts about the importance of LSI keywords.

12. Valid HTML

The myth.
HTML stands for HyperText Markup Language, which is the code behind websites. It is believed that HTML errors signal to search engines that a site is of poor quality, while valid HTML helps a site rank better.

The truth.
Google has many formatting and style rules for HTML and CSS (Cascading Style Sheets) that are aimed at helping it crawl your website more effectively. It is not a matter of improving ranking based on how good your code is; it is about writing code that Google can crawl and render properly so your site is able to rank well.

My take.
Valid HTML is not a ranking factor, but it clearly affects how Google crawls your site and how your site is rendered in a browser, and that matters.

13. Structured Data Markup

The myth.
Structured Data Markup (schema.org) helps you mark up elements on a web page so that Google can understand the data on the page. Once Google understands your page data more clearly, it can be presented more attractively and in new ways in Google Search.

There are many types of markup, such as events, names, dates, addresses, or products.

Using structured data on a website can result in rich results in search. It is also believed that this can increase rankings.

The truth.
This is what John Mueller had to say about the use of structured data: “There’s no generic ranking boost for SD usage. However, SD can make it easier to understand what the page is about, which can make it easier to show where it’s relevant (improves targeting, maybe ranking for the right terms). (not new, imo)”

My take.
It is clear that structured data is not a direct ranking factor, but it does seem to help websites rank indirectly. Helping search engines better understand your content always helps, and as a bonus, it could earn your page a rich result in search.
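As an illustration, a minimal JSON-LD block for a local business might look like the sketch below (the details are borrowed from the contact information at the end of this post, and schema.org supports many more types and properties):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Blue Lacy SEO",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "El Paso",
    "addressRegion": "TX"
  },
  "telephone": "915-494-2382"
}
</script>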

14. Server Location

The myth.
Server location refers to where the data for your website physically resides. If a user is far away from your server’s location, pages may take longer to load for that user. Many believe that having your site hosted in the country where your target audience lives increases the likelihood of ranking higher there.

The truth.
The truth is rather muddled on this one. Some years ago, Google’s Matt Cutts hinted at the possibility of better rankings if your server is hosted in the country your target audience lives in, but the guidance was never very clear.

My take.
Having your website’s server in the country where most of your visitors are is still a good idea, but not for location reasons. A closer server improves page speed, which is a ranking factor.

15. XML Sitemap

The myth.
An XML sitemap, which is invisible to visitors, is important because it helps search engines crawl your website and find all of your pages. If present, it is believed to give your rankings a boost.

The truth.
Back in 2008, Trevor Foucher of Google’s Webmaster Tools team addressed this question by saying, “a sitemap truly doesn’t affect the actual rankings of your web pages.”

My take.
Sitemaps are a guide that leads the search bots to every page on your site so each page can be indexed. They help with discovery and indexing, not ranking.
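For reference, a bare-bones XML sitemap with a single entry looks roughly like this (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://mywordpress.com/sample-post/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
</urlset>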

16. AMP

The myth.
Accelerated Mobile Pages, or AMP for short, were endorsed heavily by Google a few years ago. Essentially, AMP makes web pages lighter and faster loading on mobile devices. There have been claims that AMP pages rank higher than normal pages.

The truth.
John Mueller has said that AMP pages are not a ranking factor, but that Google could use them to determine site quality, provided they are the canonical pages.

My take.
For many sites, especially news sites, AMP is a good solution for mobile users, since it delivers content faster. Personally, I don’t implement AMP on my website or on client sites. First, it is a pain in the butt to implement, and second, it strips away components of a page that could be slowing it down, some of which, such as important images, you may actually want to keep.

17. CTR, Bounce Rate, Dwell Time

The myth.
Google takes into consideration CTR (Click Through Rate), Bounce Rate (the percentage of visitors who enter a site and then leave rather than continuing to view other pages on the same site), and Dwell Time (the amount of time that passes between the moment a user clicks a search result and the moment they return to the SERPs) to evaluate page quality, and hence ranking.

The truth.
Here’s what Google’s Gary Illyes said: “Dwell time, CTR… those are generally made up crap. Search is much more simple than people think.”

My take.
Even though Google doesn’t give a ranking boost for good CTR, bounce rate, and dwell time, it’s still a good idea to look at these metrics and try to improve them.

18. Direct Traffic

The myth.
Google has confirmed that it has been using data from Google Chrome to track traffic to websites. Based on this evidence, it is believed that Google favors sites that have a lot of direct traffic and ranks these sites higher.

The truth.
Google denies any connection between traffic and rankings: “No, traffic to a website isn’t a ranking factor.”

My take.
For many years I believed there was a correlation between direct traffic and ranking. Otherwise, why would Amazon have so many top rankings? But again, that was flawed reporting from top SEO professionals. Google may be tracking direct visits to websites (they track everything), but that doesn’t mean they use that data to rank websites.

19. User Reviews

The myth.
Google uses positive or negative reviews in ranking websites.

The truth.
If Google used reviews for ranking, wouldn’t positive reviews simply push a site up and negative reviews push it down?
The answer lies in a Google patent called “domain-specific sentiment classification.” The document discusses how reviews might be used by Google. It states that a common way to measure reviews is to identify the positive and negative words that occur and use them to calculate a score. It goes on to say that this approach has a problem: it does not account for the sentiment actually expressed by the review’s words.

To quote Google: “For example the word “small” usually indicates positive sentiment when describing a portable electronic device, but it can indicate negative sentiment when used to describe the size of a portion served by restaurant.”

This is probably why John Mueller has denied that ratings and reviews have ever been ranking factors.

My take.
User reviews on Google My Business do play an important role in local SEO by helping you rank higher in the Local Pack. They just don’t seem to affect organic rankings.

20. Links from .edu and .gov Domains

The myth.
Links pointing to a website (called backlinks) from education and government websites are believed to pass more link juice (or weight) to the websites they refer to.

The truth.
Several years ago Matt Cutts stated that links from .edu and .gov domains carry no more weight than any other backlinks.

He went on to say: “It’s not like a link from an .edu automatically carries more weight, or a link from a .gov automatically carries more weight.”

My take.
I don’t know how this got started, but SEO gurus have repeatedly said that links from education and government domains carry more weight.

It may simply be that sites on these two domains tend to be higher quality and more authoritative, which gives their links more weight. But Google gives that same credit to links from any authoritative website, not just .edu and .gov sites.

21. Short URLs

The myth.
Google wants to see a search-engine-friendly URL (Uniform Resource Locator) that is short and to the point, and it gives a small ranking boost to these types of URLs.

The truth.
Google does want to see a simple URL structure. In Google’s General Guidelines it states, “A site’s URL structure should be as simple as possible. Consider organizing your content so that URLs are constructed logically and in a manner that is most intelligible to humans (when possible, readable words rather than long ID numbers). For example, if you’re searching for information about aviation, a URL like http://en.wikipedia.org/wiki/Aviation will help you decide whether to click that link. A URL like http://www.example.com/index.php?id_sezione=360&sid=3a5ebc944f41daa6f849f730f1, is much less appealing to users.”

Google goes on to say: “Overly complex URLs, especially those containing multiple parameters, can cause problems for crawlers by creating unnecessarily high numbers of URLs that point to identical or similar content on your site. As a result, Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all the content on your site.”

My take.
Google has never said that short URLs are a ranking factor. But based on what Google has said here, if Googlebot cannot fully index a page because of an overly complex URL, that can certainly impact rankings if it was an important page.

If you are using WordPress, you should set your permalink structure to Post Name. Example: https://mywordpress.com/sample-post/ as opposed to https://mywordpress.com/?p=123.

I could elaborate much more on this, but this post is getting too long.

Summary

When discerning between truth and myth in SEO best practices, it is best to check with the source doing the actual ranking: Google. Too often, we as SEO professionals believe what the masses say without questioning it or checking it out for ourselves.

Need help with ranking your website? Contact Blue Lacy SEO.

Blue Lacy SEO
El Paso, Texas
[email protected]
915-494-2382 or 915-471-9796
