Robots.txt Testing Tool and Validator

The tool attempts to mimic the behaviour of Googlebot and Yandex, letting you check whether crawling of a URL is allowed or disallowed for search engine robots by a site's robots.txt file. Enter URLs (one per line) to test whether each is allowed or blocked, and if blocked, by which rule in robots.txt. URLs should start with the http or https protocol; for URLs without a protocol, https is assumed. You can enter URLs from different hosts, and the results will be grouped by host.
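As a rough illustration of the kind of check the tool performs, here is a minimal Python sketch using the standard library's urllib.robotparser. It is an assumption-laden approximation, not the tool's actual implementation: the stdlib parser only returns an allowed/blocked boolean and does not report which rule matched, and "Googlebot" is used here as an example user agent token.

```python
# Sketch of a robots.txt check: group URLs by host, fetch each host's
# robots.txt once, and test whether a given user agent may crawl each URL.
# NOTE: this approximates the tool's behaviour; unlike the tool, the
# stdlib parser cannot report *which* robots.txt rule matched.
from collections import defaultdict
from urllib.parse import urlsplit
from urllib.robotparser import RobotFileParser


def check_urls(urls, user_agent="Googlebot"):
    by_host = defaultdict(list)
    for url in urls:
        # URLs without a protocol default to https, as described above.
        if not url.startswith(("http://", "https://")):
            url = "https://" + url
        by_host[urlsplit(url).netloc].append(url)

    results = {}
    for host, host_urls in by_host.items():
        parser = RobotFileParser()
        parser.set_url(f"https://{host}/robots.txt")
        parser.read()  # fetch and parse the host's live robots.txt
        for url in host_urls:
            results[url] = parser.can_fetch(user_agent, url)
    return results


if __name__ == "__main__":
    # example.com is a placeholder host for illustration.
    checks = check_urls([
        "https://example.com/",
        "example.com/some/page",  # protocol added automatically
    ])
    for url, allowed in checks.items():
        print(f"{'allowed' if allowed else 'blocked':>7}  {url}")
```

Grouping by host before fetching means each robots.txt is downloaded and parsed only once, no matter how many URLs from that host are tested.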
