Forbidden by robots.txt

Screenshot: Pope Tech crawl results showing a Forbidden by robots.txt error.

Forbidden by robots.txt is the most common crawl error type. The Pope Tech crawler respects your robots.txt file, so any pages that file blocks will not be crawled. To resolve this, update the robots.txt file found at the root of your site. You can leave it in place to block other bots, but add lines that specifically allow the Pope Tech user agents.

Example robots.txt entry (the empty Disallow directive allows these user agents to crawl the entire site):

User-agent: PopeTech-CrawlBot
User-agent: PopeTech-ScanBot
Disallow:
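
As a fuller sketch, a robots.txt that blocks all other bots while still letting Pope Tech crawl could look like the following. Crawlers match the most specific group that names their user agent, so the Pope Tech bots follow their own group rather than the wildcard one:

User-agent: *
Disallow: /

User-agent: PopeTech-CrawlBot
User-agent: PopeTech-ScanBot
Disallow: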