A 404 page, or error page, is what a user sees when trying to reach a non-existent page on a website: it is the page your server displays when it cannot find the URL the user requested.
“404 – Oops, this page was not found!” Everyone has come across this kind of message while browsing the web.
The 404 code flags a URL that is not (or is no longer) available, telling the visitor that the request cannot be fulfilled because no corresponding content exists. This type of error can result from the deletion of a page, a backlink pointing to an incorrect address, a page generated in error by your CMS (WordPress, Joomla, etc.), a change of domain name, a URL that changed during a site migration or an SEO redesign, or simply a typing mistake by the user.
Whatever the reason for this message, it is important to intervene and fix the problem, or you risk seeing your site's traffic decline. More than an SEO issue, it is above all a problem that degrades the user experience.
The impact can be damaging both for the user experience and for the crawl budget, which is wasted on inaccessible pages.
Here are some tips on what to do in these situations.
The 404 code comes from the HTTP protocol and indicates that a resource is unavailable. The page that appears is a message from the server meaning the requested content cannot be found: the requested URL does not exist, or no longer exists.
It can result from a content change that generated a new URL, the deletion of a page or piece of content, or simply a mistyped link pointing to your site. In any case, the message is final and leaves the visitor facing an unreachable resource. It is a safe bet that the user will move on and cut their visit to your website short.
As for search engines (Google's robots in particular), they will try to crawl the resource in question and end up in a dead end, logging crawl errors for lack of corresponding content. The bots thus visit in vain, wasting part of the crawl budget allocated to your pages and their indexing. Yet the easier we make the bots' crawl, the more they tend to favor indexing the site in question.
Better still, they then visit the site regularly, ensuring that any updates are processed often enough to be detected. Otherwise, the bots will tend to come by less often, to avoid spending their crawl budget unnecessarily on your internal pages.
Between the user experience, which can suffer from this type of problem, and the engines' inability to serve users the desired content, your site may be penalized by the presence of 404s en masse. Your rankings and positioning can thus be affected by this kind of anomaly.
To find the 404 errors coming from your site easily, several tools can help you detect them.
One of the features of Google Search Console lets you identify the error messages served to users when they browse your pages.
In the “Coverage” report, a line dedicated to 404 errors appears, showing the number of URLs returning this message. You can then analyze and correct any crawl errors detected.
Take care to refresh the report so that you are only looking at actual 404s, not pages you have already fixed. To do this, simply mark each error found as corrected and wait for Search Console to report the 404s that remain.
Again, Google Webmaster Tools is a tool made available by Google to analyze your website in depth.
In the “Crawl” tab, you can review the errors from the most recent crawls and export all the URLs returning a 404 as an Excel file. Pay attention to the detection date: URLs may appear that you have already fixed, and those should not be taken into account.
Either way, you will have a solid working basis for fixing these errors.
Screaming Frog lets you crawl your site and analyze the response code of every resource on it. By filtering only the pages returning a 404 error, you can export the list as an Excel file, making it easy to spot and analyze the affected pages.
Note: analyzing your site with Search Console or Google Webmaster Tools does not exempt you from running your own crawl. A tool like Screaming Frog or Xenu may detect more “culprit” URLs than the tools provided by Google.
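As a complement to these tools, a 404 inventory can also be pulled straight from your server's access logs. The sketch below is a minimal illustration assuming a standard combined log format (as written by Apache or Nginx); the sample log lines and paths are hypothetical.

```python
import re
from collections import Counter

# Matches the request and status code in a combined-format access log line.
LOG_PATTERN = re.compile(r'"(?:GET|POST|HEAD) (?P<url>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def find_404s(lines):
    """Return a Counter of URLs that were answered with a 404."""
    hits = Counter()
    for line in lines:
        match = LOG_PATTERN.search(line)
        if match and match.group("status") == "404":
            hits[match.group("url")] += 1
    return hits

# Example with two illustrative log lines:
sample = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0200] "GET /old-page HTTP/1.1" 404 512',
    '1.2.3.4 - - [10/Oct/2023:13:55:40 +0200] "GET /blog HTTP/1.1" 200 4096',
]
print(find_404s(sample))  # the URLs to investigate, with their hit counts
```

Sorting the resulting counter by frequency tells you which missing pages visitors actually hit most often, which is a sensible order in which to fix them.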
There are several ways to remedy problems caused by unavailable resources.
One way to make the error message disappear is to restore the URL that triggers it.
This method is not always applicable; however, when the content can actually be recreated, it can be worthwhile. During subsequent crawls, Google's bots will detect the new content and no longer consider the URL to be in error. Your site will thus regain the bots' favor, and your SEO with it.
If this option is applicable to your site or niche, do not hesitate to implement it.
The other way to make sure the user experience is not tainted while exploring your site is to set up 301, or permanent, redirects.
301 redirects send a signal to search engines: they prevent the bots from getting stuck in a dead end (a 404) and let them continue crawling the site.
In this case, it is best to redirect to relevant content close to the URL returning the error. For example, it is preferable to send visitors to a related category, or to content related to the missing page.
The effect will be all the more beneficial for your SEO, and the user will ultimately land on content likely to meet their expectations.
Webmasters can also redirect to the site's home page, but when there are many 404 errors this approach may prove irrelevant, or even harmful to the user experience. It is therefore preferable to redirect visitors to a page likely to interest them and answer their request, thus avoiding a bounce.
Several methods can be used to create the redirections needed to correct your 404 pages and send robots and traffic to the right pages.
You can intervene directly on the server to declare these redirections. To do this, you need to edit your .htaccess file or the Apache configuration file and add a directive similar to this:
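As a minimal sketch (the paths shown are illustrative), a permanent redirect in .htaccess can look like this:

```apache
# Redirect a single removed page to its replacement (301 = permanent)
Redirect 301 /old-page.html https://www.example.com/new-page.html

# Or, with mod_rewrite enabled, redirect a whole renamed directory
RewriteEngine On
RewriteRule ^old-directory/(.*)$ /new-directory/$1 [R=301,L]
```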
This practice is the same as when you create a redirect from one domain to another. Applicable to all types of resources, it helps ensure that search engines pick up the change quickly and easily.
In this case, we send an HTTP header that the engines interpret, leading them to take the destination URL into account instead of the one returning a 404 error.
header("Status: 301 Moved Permanently", false, 301);header("Location: http://www.exemple.fr/repertoire/page.php");exit();?>
For the bots to take this type of redirect into account, be sure to place this code at the very beginning of your page, before any HTML output is sent (PHP's header() function fails once output has started). The redirect will only be picked up faster and more reliably.
There are “turnkey” solutions when you use certain CMSs. For WordPress, for example, plugins have been developed to make setting up redirections easier.
Simply enter the URL of the page returning a 404, then the URL to which navigation should be redirected. Using a plugin of this type requires no particular skills; it is the simplest way to fix this kind of problem, especially if you do not have server access to make the required changes.
Beyond fixing deleted or erroneous URLs, it still happens that a user makes a typo when entering an address and ends up facing an error.
Among Google's many guidelines is the creation of a personalized 404 error page. Indeed, rather than leaving the user on a generic page, it can be far more worthwhile to put this page to work.
By creating a page consistent with the rest of your website (visual identity, menu, etc.), the user will be less thrown off by the message. You can also take the opportunity to display internal links to pages likely to interest them, encouraging them to keep browsing without leaving your site.
Clearly, the 404 error page can be seen as an opportunity to offer visitors alternative navigation paths.
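On an Apache server, pointing visitors to such a custom page takes a single directive in the site configuration or .htaccess (the file name below is illustrative):

```apache
# Serve our branded 404 page instead of the server's default error message
ErrorDocument 404 /custom-404.html
```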
One important point to analyze when hunting for errors returned by your site is netlinking.
It would be a shame if a site linking to yours actually pointed to a page that does not exist: in that case, the backlink would have no effect at all.
It can therefore be useful to check from time to time that third-party sites linking to your pages point to working URLs. If not, since you have no control over those sites, it is worth contacting the people who run them to fix the situation. Think about it: your netlinking could be optimized as a result.
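A periodic check of this kind is easy to script. The sketch below is a minimal illustration: the URL list is hypothetical, and the status-fetching function is passed in as a parameter so the filtering logic can be tried without network access (in practice you could pass a small wrapper around urllib.request or your HTTP client of choice).

```python
from typing import Callable, Iterable, Optional

def broken_targets(urls: Iterable[str],
                   fetch_status: Callable[[str], Optional[int]]) -> list[str]:
    """Return the URLs that answer with a 404 (or are unreachable)."""
    broken = []
    for url in urls:
        try:
            status = fetch_status(url)
        except OSError:
            status = None  # treat unreachable hosts as broken too
        if status is None or status == 404:
            broken.append(url)
    return broken

# Illustrative usage with a stubbed fetcher standing in for real HTTP requests:
statuses = {
    "https://www.example.com/guide": 200,
    "https://www.example.com/old-guide": 404,
}
print(broken_targets(statuses, statuses.get))
```

Running such a script against the list of backlink targets reported by your SEO tools tells you exactly which site owners to contact.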
While these errors do not directly penalize your SEO, they can harm the user experience. The bounce rate Google observes on your pages could then be interpreted as a negative signal, suggesting that your content is of poor quality.
Over time, the damage leads to a loss of credibility and relevance in Google's eyes, ultimately costing you positions on search engine results pages.
So be sure to check regularly for 404 errors and fix them quickly. With good SEO habits, this kind of verification takes very little time, and repairing the problems found is done in the blink of an eye.
However, if you lack the skills required to detect and repair these problems, do not hesitate to contact us: we will restore your site to perfect condition and give you the appropriate advice. We also assist with all delicate operations that may cause a loss of traffic or SEO positioning (redesign, migration, etc.).