Forbidden by robots.txt

Forbidden by robots.txt is the most common crawl error type. The Pope Tech crawler respects your robots.txt file. To resolve the error, update the robots.txt file found at yourdomain.com/robots.txt. You can leave it blocking other bots, but add a line that specifically allows the Pope Tech user agents. Example robots.txt entry: User-agent:…
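A robots.txt file along these lines blocks other crawlers while allowing a specific one. The user-agent string below is a placeholder, not Pope Tech's actual agent name; use the exact user agents listed in your Pope Tech account or documentation:

```
# Keep blocking all other bots
User-agent: *
Disallow: /

# Allow the Pope Tech crawler
# (replace with the user-agent string from your Pope Tech account)
User-agent: PopeTech-Crawler-Placeholder
Allow: /
```

More specific `User-agent` groups take precedence over the `*` group, so the allow rule applies even though everything else is disallowed.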

Mass Upload Websites

Summary: Users can upload websites in bulk via the Mass Upload function. Mass Upload enables users to quickly add large portfolios of websites to Pope Tech. We recommend using our CSV mass website import template without modifying the header row or adding or removing columns. When to use When…
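As an illustration of why the header row must stay intact, a pre-upload check could compare a file's header against the template before importing. This is a sketch only; the column names below are invented, not Pope Tech's actual template columns:

```python
import csv

# Hypothetical template header; the real column names come from
# Pope Tech's CSV import template and may differ.
TEMPLATE_HEADER = ["name", "url", "group"]

def header_matches_template(path):
    """Return True if the CSV's header row exactly matches the template."""
    with open(path, newline="") as f:
        header = next(csv.reader(f), [])
    return header == TEMPLATE_HEADER
```

Running a check like this before uploading catches accidentally reordered, renamed, or deleted columns early, instead of after a failed import.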

Organization Default Crawler Settings

The defaults for two crawler settings, “Max Pages” and “Max Depth,” can be set at the organization level. Once these are set, all new websites created will use these defaults. Setting new organization crawler defaults will not override any existing website’s individual crawler settings. You can adjust an individual website’s crawler settings in that website’s settings.…
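The precedence described above, where organization defaults seed new websites but never overwrite existing per-site settings, can be modeled roughly like this. The names and values are illustrative assumptions, not Pope Tech's actual API or defaults:

```python
# Hypothetical organization-level defaults (illustrative values only).
ORG_DEFAULTS = {"max_pages": 100, "max_depth": 5}

def new_website_settings(overrides=None):
    """New websites start from a copy of the organization defaults."""
    settings = dict(ORG_DEFAULTS)
    settings.update(overrides or {})
    return settings

def update_org_defaults(existing_sites, new_defaults):
    """Changing org defaults affects future websites only;
    existing websites' settings are left untouched."""
    ORG_DEFAULTS.update(new_defaults)
    return existing_sites
```

The key design point mirrored here is that defaults are copied at creation time rather than referenced live, which is why changing them later has no effect on websites that already exist.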

Getting Started

This article covers how to create your web accessibility baseline by crawling and scanning a website, and how to set up a recurring accessibility cadence with monthly automated reports.