Fatal errors

This section contains solutions to common problems in the “Fatal” category that are detected during website troubleshooting in Yandex.Webmaster. Problems in this category cause individual pages or the entire site to be excluded from search results.
Tip. Track and fix errors as soon as possible. You can configure notifications for site monitoring results.
  1. Main page returns an error
  2. Failed to connect to the server due to DNS error
  3. Site indexing is disallowed in the robots.txt file
  4. Security problems or violations detected

Main page returns an error

If you receive this message, follow these steps:
  1. Check for the presence of the noindex element in the HTML code of the page. If you find it, delete it. (A scripted version of checks 1-3 is sketched after this list.)
  2. Go to the Indexing → Crawl statistics page in Yandex.Webmaster and check the Currently column to see what response code the page returned to the Yandex robot's requests. If the response code differs from 200 OK, check whether the problem persists using the Server response check tool.

  3. When checking the server response, pay attention to the Page content section. If the message “Missing page content” is displayed, check your server settings:

    • The HTTP headers. For example, if they contain "Content-Length: 0", the robot won't be able to index the page.
    • The size of the page content. It must be greater than 0 and match the Content-Length HTTP header.
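
The checks in steps 1-3 can also be run as a short script. The sketch below is a rough, standard-library-only version in Python: the URL is a placeholder for your site's main page, and the noindex search is a simple pattern match rather than a full HTML parse.

    # Rough version of the checks in steps 1-3 above.
    import re
    import urllib.error
    import urllib.request

    URL = "https://example.com/"  # placeholder: your site's main page

    try:
        response = urllib.request.urlopen(URL, timeout=10)
        status = response.status
    except urllib.error.HTTPError as err:
        response = err  # HTTPError still carries the headers and body
        status = err.code

    body = response.read()
    headers = response.headers

    # Step 2: the page should answer 200 OK.
    print("HTTP status:", status)

    # Step 1: look for a robots meta tag that contains "noindex".
    html = body.decode("utf-8", errors="replace")
    noindex = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.IGNORECASE
    )
    print("noindex meta tag found:", bool(noindex))

    # Step 3: the content size must be greater than 0 and match the
    # Content-Length header, if the server sends one.
    content_length = headers.get("Content-Length")
    print("Body size:", len(body), "bytes")
    if content_length is not None:
        print("Content-Length header:", content_length,
              "matches body:", int(content_length) == len(body))

If the script reports a status other than 200 OK, a noindex tag, or an empty or mismatched content size, fix that issue first and then let the robot recrawl the page.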

After you make changes, wait until the robot crawls the site again. You can also send the page for reindexing and the robot will recrawl it within two weeks.

If the check shows that the page responds with the 200 OK code and there are no problems with the availability of its content, the warning in Yandex.Webmaster will disappear within a few days.

Failed to connect to the server due to DNS error

Once a day, the indexing robot queries DNS servers to determine the IP address of the server that hosts the site. If the site's DNS records are configured incorrectly, the robot cannot obtain the site's IP address. As a result, the site cannot be indexed or added to search results.
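
If you want to reproduce this lookup yourself, the sketch below performs the same kind of DNS resolution in Python; the host name is a placeholder for your site's domain.

    # Resolve the site's host name to an IP address, as the robot does.
    import socket

    HOST = "example.com"  # placeholder: your site's domain

    try:
        records = socket.getaddrinfo(HOST, 443, proto=socket.IPPROTO_TCP)
    except socket.gaierror as err:
        # A failure here corresponds to the "DNS error" warning in Yandex.Webmaster.
        print("DNS lookup failed:", err)
    else:
        for family, _type, _proto, _canonname, sockaddr in records:
            print(family.name, sockaddr[0])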

Check that the server responds correctly to the indexing robot. If the site is still unavailable, contact your hosting provider to correct your domain's DNS records. Once the website becomes accessible, the information in Yandex.Webmaster is updated within a few days.

The problem with accessing the site may be short-term. If you do not find any errors when re-checking the server response, wait for the information to be updated in Yandex.Webmaster. This should happen within a few days.

If the domain name was not renewed in time, the site becomes unavailable for indexing. Renew the domain registration. After this, the message in Yandex.Webmaster will disappear within a few days.

Site indexing is disallowed in the robots.txt file

The indexing robot requests the robots.txt file several times a day and updates information about it in its database. If the robot receives a prohibiting directive in response to a request, a warning appears in Yandex.Webmaster.

Check the robots.txt file contents. If the prohibiting directive is still present, delete it from the file. If you can't do this yourself, contact your hosting provider or domain name registrar. After the directive is removed, the data in Yandex.Webmaster is updated within a few days.
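
To see how the current robots.txt file affects crawling, you can test it with the standard Python robots.txt parser. The sketch below is only an illustration: the URL is a placeholder for your site, and it checks the site root for the Yandex user agent and for all agents.

    # Check whether robots.txt currently prohibits crawling the site root.
    from urllib import robotparser

    SITE = "https://example.com/"  # placeholder: your site
    parser = robotparser.RobotFileParser()
    parser.set_url(SITE + "robots.txt")
    parser.read()

    # If the site root is disallowed for "Yandex" (or for all agents via "*"),
    # the site cannot be indexed and the warning will remain.
    for agent in ("Yandex", "*"):
        print(f"User-agent {agent!r} may fetch the site root:",
              parser.can_fetch(agent, SITE))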

If the domain name was not renewed in time, the robots.txt file will contain prohibiting directives for the robot. Renew the domain registration. After this, the message in Yandex.Webmaster will disappear within a few days.

Security problems or violations detected

If a search rule violation is found

See the description of the violation and recommendations for correcting it.

If a security threat is found

Go to the Diagnostics → Security and violations page to see the description of the detected threat and recommendations for removing it.

When you have resolved the problem, do the following:

  1. Make sure the problem is fixed. If the recheck detects the threat again, you will not be able to report the fix again for a month. With further failed rechecks, this waiting period increases, up to a maximum of three months.
  2. In Yandex.Webmaster, go to the Diagnostics → Security and violations page and click I fixed it. This gives the Yandex algorithms an additional signal that the site should be rechecked. If the check is successful, the restrictions will be lifted over time and the violation information will no longer be displayed.
