We help you understand why your SEO strategy might not be working
If you are applying an SEO strategy but it is not working as expected, the algorithm is not necessarily to blame. You can have an excellent content strategy, with every necessary optimization, and still find that your website never quite ranks. Do you know why this happens? Because in most cases, we forget about technical SEO.
You see, Google considers approximately 200 factors when ranking search results. And while many relate to content and domain authority, a great deal comes down to user experience: speed, design, usability, and other factors that hold users' attention for longer.
Good content and web improvements go hand in hand; one does not rank without the other.
Next, we will explain what technical factors you should constantly review to maintain the good SEO health of your website. The improvements can be applied by your webmaster or technology team and usually do not take much time.
As a basis, we will use the Semrush study of 310 million pages across 250,000 web projects.
HTTP status codes are your server's response to a browser request. Codes in the 4xx range mean there was a syntax error or the request could not be completed. This type of error appeared in 47.3% of the cases studied.
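As an aside (not part of the Semrush study), status codes group into classes by their first digit, which is why 4xx and 5xx are discussed as families. A minimal Python sketch of that grouping:

```python
def status_class(code: int) -> str:
    """Map an HTTP status code to its class by its first digit."""
    classes = {
        1: "informational",
        2: "success",
        3: "redirection",    # e.g. 301 Moved Permanently
        4: "client error",   # e.g. 404 Not Found, the classic broken link
        5: "server error",   # e.g. 500 Internal Server Error
    }
    return classes.get(code // 100, "unknown")

print(status_class(404))  # a broken link shows up as a client error
```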
42.4% of the analyzed sites lost ranking because of broken internal links. This happens when a page links to another page that no longer exists, and it signals to Google that the site is not being maintained.
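Auditing for broken internal links usually starts by extracting every href from a page. A minimal sketch using only Python's standard library (the HTML fragment is hypothetical); each extracted URL would then be requested and its status code checked:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page fragment to audit:
sample = '<p><a href="/blog/old-post">old</a> <a href="/contact">contact</a></p>'
collector = LinkCollector()
collector.feed(sample)
print(collector.links)  # request each of these and flag any 4xx response
```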
This error is similar to the previous one, but involves URLs that point to external domains. It affected 46.5% of the websites.
It usually occurs when the server response time is greater than 5 seconds or access to the web page is denied. This error was detected in 29% of cases.
It happens when the image file no longer exists or the URL doesn't work. This was found in 16% of the projects analyzed by Semrush.
The status of temporary and permanent redirects also affects organic ranking.
Meta tags are signals that tell search engines what your content is about. You may know them better as SEO title, meta description, and image ALT attributes. All these fields must have well-defined keywords to give a clear and direct message to Google.
When these fields are not optimized, the system generates them automatically, causing duplication and matching issues that negatively affect your SEO.
Your content must have original titles and descriptions; duplicates often confuse the search engine's algorithm during ranking. The error is so common that it was found on 50.8% of the web pages Semrush tested.
This error was present in 64% of cases. The H1 tag is the main heading of your page and one of the most important signals for Google. Without it, the search engine cannot properly understand your content.
If your website has a very low CTR, it probably has to do with missing meta descriptions. Users see the meta description in the SERPs (search engine results pages), and it has a big impact on the click-through rate. As a consequence, it weakens other indicators of your website that matter during ranking. Missing meta descriptions affected approximately 67% of the websites in the study.
In order for Google to understand the images on your website, it is important to place keywords in their ALT attributes. Otherwise they will lose relevance.
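The checks above can be combined into a simple on-page audit. A sketch using only Python's standard library (the sample HTML is hypothetical) that flags a missing meta description, a missing H1, and images without ALT text:

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Flag common on-page SEO omissions on a single page."""
    def __init__(self):
        super().__init__()
        self.has_description = False
        self.has_h1 = False
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "description":
            self.has_description = True
        elif tag == "h1":
            self.has_h1 = True
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

# Hypothetical page: has an H1, but no meta description and one untagged image.
sample = "<html><head><title>Post</title></head><body><h1>Title</h1><img src='a.jpg'></body></html>"
audit = MetaAudit()
audit.feed(sample)
print(audit.has_description, audit.has_h1, audit.images_missing_alt)
```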
Duplicate content can be very detrimental to your organic ranking. And it can happen in blogs, descriptions and/or meta tags.
This can happen, for example, when your website serves content in two languages. In these cases, there are fast and effective solutions, such as adding a rel=canonical tag or creating a 301 (permanent) redirect.
This error affected more than 50% of the web pages considered in the study.
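For reference, the canonical tag is a single line in the page's head that points Google at the preferred version of the content. A sketch that builds it (the URL is hypothetical):

```python
def canonical_tag(preferred_url: str) -> str:
    """Build the <link rel="canonical"> tag for a page's <head>."""
    return f'<link rel="canonical" href="{preferred_url}">'

# On the duplicate (e.g. a translated version), point back at the original:
print(canonical_tag("https://example.com/blog/technical-seo"))
```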
There are two very common SEO errors related to the sitemap.xml. The first and most important: not having one at all.
Sitemaps.xml files are, as the name implies, a map of your website. They help Google understand the structure of your project, which greatly benefits SEO. In fact, the sitemap should be referenced in your robots.txt file.
Despite its importance, more than 17% of websites do not have one. And those that do are possibly making the second most common mistake: having broken links within the sitemap.xml.
This failure is very frowned upon by Google, because the algorithm highly values site updates and maintenance: they signal that the website is constantly being improved for its users.
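A minimal sitemap.xml can be generated with Python's standard library. The URLs below are hypothetical, and the resulting file would then be referenced from robots.txt (via a `Sitemap:` line):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml body from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url  # one <loc> per page
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages to list:
sitemap = build_sitemap(["https://example.com/", "https://example.com/blog/"])
print(sitemap)
```

Keeping this generation automated helps avoid the second mistake above: URLs removed from the site but left behind in a hand-maintained sitemap.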
72.3% of web pages were found to have low word counts, which to Google suggests the content might not be high quality.
One of the best-known SEO guidelines is that texts should have at least 300 words. Word count alone probably does not define the quality of an article, but in a context where a million other pages offer the user the same information, only more detailed or better explained, Google starts running its tests.
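As a quick sanity check for thin content, a sketch like the following can flag short pages; the 300-word floor is the guideline mentioned above, not a Google-confirmed threshold:

```python
def is_thin(text: str, minimum: int = 300) -> bool:
    """Flag content whose plain-text word count falls below a floor."""
    return len(text.split()) < minimum

print(is_thin("just a few words"))  # a four-word stub is flagged
print(is_thin("word " * 500))       # a 500-word article passes
```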
According to Hubspot's latest research (2020), the majority of articles on the first page of Google average 1,500 words. And that tells us a lot about user preferences.
So, always try to make your content competitive. It's not about writing for writing's sake, either. Offer quality in your content and consider the tastes of your audience.