Forbidden by robots.txt

Forbidden by robots.txt is the most common crawl error type. The Pope Tech crawler respects your robots.txt file, so pages it disallows cannot be scanned. To resolve this, update the robots.txt file found at yourdomain.com/robots.txt. You can leave it blocking other bots and add a line that specifically allows the Pope Tech user agents. Example robots.txt entry: User-agent:…
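A minimal sketch of such an entry is below. The user-agent name "PopeTechBot" is illustrative only; check Pope Tech's own documentation for the exact agent strings to allow.

```
# Block all other crawlers site-wide
User-agent: *
Disallow: /

# Allow the Pope Tech crawler
# (agent name is an assumption; confirm the real string with Pope Tech)
User-agent: PopeTechBot
Disallow:
```

An empty `Disallow:` line grants that agent access to the whole site, and most parsers apply the most specific matching user-agent group, so the blanket block above does not affect it.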

Jan 14th, 2020 Update

This update includes the following:

- User setting for table columns that show
- Show crawl errors by default
- Search filter on User, Schedule, and Scan Detail views
- Additional table filtering throughout application
- New permission for managing roles
- Allow duplicates on website mass upload

User setting for table columns that show: Throughout the Pope Tech application on…

4k Project – State Rankings (2019)

State Rankings: rankings for web accessibility errors in Higher Education for each state and territory in the US. Errors include detectable errors and contrast errors found by the WAVE engine. The sample attempts to include every Higher Ed institution in the US with a .edu domain. Rankings were determined by calculating the median of each institution’s…
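The per-institution median described above can be sketched as follows. The sample data, institution names, and the state-level aggregation step (averaging institution medians) are assumptions for illustration; the excerpt is truncated before the full method is stated.

```python
from collections import defaultdict
from statistics import median

# Made-up sample: accessibility errors per scanned page,
# keyed by (state, institution) -- not Pope Tech's actual data
pages_scanned = {
    ("UT", "Example University"): [3, 5, 2],
    ("UT", "Sample College"): [10, 8],
    ("NY", "Demo Institute"): [1, 0, 2, 4],
}

# Step 1: median errors per institution (as the excerpt describes)
inst_median = {key: median(errs) for key, errs in pages_scanned.items()}

# Step 2: aggregate institutions by state (averaging medians here is an assumption)
by_state = defaultdict(list)
for (state, _inst), m in inst_median.items():
    by_state[state].append(m)

# Rank states from fewest to most errors
rankings = sorted(by_state, key=lambda s: sum(by_state[s]) / len(by_state[s]))
print(rankings)
```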