"Forbidden by robots.txt" is the most common crawl error type. The Pope Tech crawler respects your robots.txt file, found at yourdomain.com/robots.txt. To resolve this error, update that file: you can keep blocking other bots, but add lines that explicitly allow the Pope Tech user agents.
Example robots.txt entry:
User-agent: PopeTech-CrawlBot
Disallow:

User-agent: PopeTech-ScanBot
Disallow:
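If you want to verify the change before the next crawl, one way is a quick sketch with Python's standard-library robots.txt parser. The rules string below is an assumed example (a site that blocks all other bots but allows the Pope Tech agents); swap in your real robots.txt content and domain.

```python
# Sketch: check which user agents a robots.txt allows, using Python's
# stdlib parser. The rules below are an example, not your live file.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /

User-agent: PopeTech-CrawlBot
Disallow:

User-agent: PopeTech-ScanBot
Disallow:
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# An empty Disallow: line means "allow everything" for that agent.
for agent in ("PopeTech-CrawlBot", "PopeTech-ScanBot", "SomeOtherBot"):
    print(agent, parser.can_fetch(agent, "https://yourdomain.com/page"))
```

With these rules, both Pope Tech agents are allowed while any other bot is still blocked by the wildcard entry.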