Robots.txt parsing errors
Errors
The list of errors that can occur when parsing the robots.txt file.
| Error | Yandex extension | Description |
| --- | --- | --- |
| Rule doesn't start with / or * | Yes | A rule can start only with the / or * character. |
| Multiple 'User-agent: *' rules found | No | Only one rule of this type is allowed. |
| Robots.txt file size limit exceeded | Yes | The number of rules in the file exceeds 2048. |
| No User-agent directive in front of rule | No | A rule must always follow a User-agent directive. The file may contain an empty line after User-agent. |
| Rule too long | Yes | The rule exceeds the length limit of 1024 characters. |
| Incorrect Sitemap URL | Yes | The Sitemap file URL must be specified in full, including the protocol. For example: https://www.example.com/sitemap.xml |
| Invalid Clean-param directive format | Yes | The Clean-param directive must contain one or more parameters that the robot should ignore, plus a path prefix. Parameters are separated from each other with the & character and from the path prefix with a space. |
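For reference, here is a minimal robots.txt file that avoids all of the errors listed above. The domain, paths, and parameter names are placeholders:

```
# Rules must immediately follow a User-agent line, with no blank line in between
User-agent: *
Disallow: /admin/          # a rule starts with / ...
Allow: *.css               # ... or with *
# Clean-param: parameters are joined with &; a space separates them from the path prefix
Clean-param: ref&utm_source /catalog/

# The Sitemap URL is absolute, including the protocol
Sitemap: https://www.example.com/sitemap.xml
```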
Warnings
The list of warnings that can occur when parsing the robots.txt file.
| Warning | Yandex extension | Description |
| --- | --- | --- |
| You may have used an invalid character | Yes | The file contains a special character other than * and $. |
| Unknown directive detected | Yes | The file contains a directive that isn't described in the rules for using robots.txt. It may be used by the robots of other search engines. |
| Syntax error | Yes | The line cannot be interpreted as a robots.txt directive. |
| Unknown error | Yes | An unknown error occurred while the file was being analyzed. Contact the support service. |
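As an illustration, the lines below would trigger some of these warnings. Noindex is used here purely as an example of a directive that isn't described in the rules for using robots.txt:

```
User-agent: *
Disallow: /*.pdf$    # valid: * and $ are the only supported special characters
Noindex: /private/   # triggers "Unknown directive detected"
Disallow /old/       # triggers "Syntax error": the colon after the directive name is missing
```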
URL validation errors
The list of URL validation errors in the Robots.txt analysis tool.
| Error | Description |
| --- | --- |
| Syntax error | The URL contains a syntax error. |
| This URL does not belong to your domain | The specified URL doesn't belong to the site whose robots.txt file is being parsed. You may have entered the address of a site mirror or misspelled the domain name. |
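For example, if the robots.txt file is being parsed for https://www.example.com (a placeholder address), the checks would apply as follows:

```
https://www.example.com/catalog/page.html    # accepted: belongs to the checked site
https://example-mirror.com/catalog/page.html # "This URL does not belong to your domain"
```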