A tool to analyze and validate robots.txt files for syntax errors, based on the de facto robots exclusion standard.
Robots.txt, or the robots exclusion protocol (REP), is a text file webmasters create to instruct search engine robots how to crawl and index pages on their website. It is not mandatory, and it is useful when search engines frequently visit your site and index your content; however, there are often cases where you do not want parts of your online content indexed. Robots.txt must be placed in the top-level directory of a web server in order to be found by search engines.
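As a sketch of how these rules work in practice, the snippet below parses a small illustrative robots.txt with Python's standard-library urllib.robotparser and checks which URLs a crawler may fetch (the rules, user agent names, and domain are examples, not part of this tool):

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt, as it might appear at https://example.com/robots.txt
# (rules and domain are purely illustrative)
rules = """\
User-agent: *
Disallow: /private/
Allow: /

User-agent: BadBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Any crawler may fetch public pages, but not anything under /private/
print(parser.can_fetch("*", "https://example.com/index.html"))      # True
print(parser.can_fetch("*", "https://example.com/private/x.html"))  # False

# The hypothetical "BadBot" is blocked from the entire site
print(parser.can_fetch("BadBot", "https://example.com/index.html"))  # False
```

Each `User-agent` group applies to the crawlers it names; a crawler uses the most specific group that matches it, which is why `BadBot` is denied everywhere even though the wildcard group allows most paths.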
Related Tools: Spider View | Google Banned Checker
Enter Path to Robots.txt File
-OR-
Enter Robots.txt File Contents (Priority)
User Agent (Optional)
URLs to Test (Optional)