
About the Website Validator

What are the most common errors on your website? The Website Validator crawls a website, runs each page through a W3C HTML validator, and summarizes the results for you. It breaks down errors and warnings by type, and reports the overall percentage of error-free pages. In the Error Summary tab you'll find a detailed description of every error encountered on your site. Click over to the Pages With Errors tab to start tackling those problem spots. Keep in mind that not all errors in HTML markup are equally problematic; many have little to no impact on how the page is displayed to the user.
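The exact bookkeeping behind that summary isn't published, but a minimal Python sketch of the idea might look like the following. It assumes each page's results arrive as Nu-HTML-Checker-style JSON messages (the services are described below); everything here is illustrative, not the tool's actual code:

    from collections import Counter

    def summarize(pages):
        """Group errors by message and compute the error-free percentage.

        `pages` maps each URL to a list of Nu-checker-style message dicts,
        e.g. {"type": "error", "message": 'Stray end tag "div".'}.
        """
        error_counts = Counter()
        clean_pages = 0
        for url, messages in pages.items():
            errors = [m for m in messages if m.get("type") == "error"]
            if not errors:
                clean_pages += 1
            for m in errors:
                error_counts[m["message"]] += 1
        pct_clean = 100.0 * clean_pages / len(pages) if pages else 100.0
        return error_counts, pct_clean

    # Two hypothetical pages: one clean, one with a repeated error.
    pages = {
        "https://example.com/": [],
        "https://example.com/about": [
            {"type": "error", "message": 'Stray end tag "div".'},
            {"type": "error", "message": 'Stray end tag "div".'},
        ],
    }
    counts, pct = summarize(pages)
    print(counts.most_common(3), f"{pct:.0f}% of pages error-free")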

This tool uses the web services of the W3C Markup Validation Service and the Nu HTML Checker. We spread requests across multiple validators to reduce the load on any one validator. These two services may be running different Nu HTML Checker instances, and thus may report different errors even when validating the same page.
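For illustration, a single page can be checked against either service directly. The sketch below uses the Nu HTML Checker's documented GET interface (the doc and out=json parameters) and simply alternates between the two public endpoints; endpoint availability and response details are assumptions, so treat this as a sketch rather than a reference client:

    import itertools
    import json
    import urllib.parse
    import urllib.request

    # Two public Nu HTML Checker front ends; cycling between them
    # spreads requests rather than hammering one service.
    ENDPOINTS = itertools.cycle([
        "https://validator.w3.org/nu/",
        "https://validator.nu/",
    ])

    def validate(page_url):
        """Fetch a JSON validation report for one page from the next endpoint."""
        endpoint = next(ENDPOINTS)
        query = urllib.parse.urlencode({"doc": page_url, "out": "json"})
        request = urllib.request.Request(
            f"{endpoint}?{query}",
            headers={"User-Agent": "example-validator-client/0.1"},
        )
        with urllib.request.urlopen(request, timeout=30) as response:
            return json.load(response)["messages"]

    messages = validate("https://example.com/")
    print(sum(1 for m in messages if m.get("type") == "error"), "errors")

In the Nu checker's JSON output, errors have "type": "error", while warnings appear as "info" messages with a "subType" of "warning".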

We ask that you be kind to these third-party services. When making changes to a single URL, use the "See Validation" link on the Pages With Errors tab to recheck just that page rather than rerunning the Website Validator over your entire site. When making changes to a template shared between pages, we recommend first verifying the template is error-free by validating a single page that uses it before rerunning the Website Validator. Our spider has a 400-page daily cap; these services may have their own caps.

About the Spider

To borrow a term from Dungeons & Dragons, the datayze spider is a Neutral Good spider. This tool is designed for website engineers who want to improve their sites' navigability, and possibly their search engine rankings. Since it assumes the webmaster of each domain is the one initiating the crawl request, it crawls each and every page it can find. The spider does not store the content of the pages it crawls. Effectively, the spider treats each page as "FOLLOW, NOINDEX" regardless of the robots.txt file or robots meta tag. Hence the lack of a Lawful Good alignment.

We do recognize the potential for abuse, and have put several safeguards in place to ensure our spider remains on the Good side of the alignment chart. Our spider crawls at a leisurely rate of one page every 1.5 seconds. While the spider doesn't keep track of the contents of the pages it crawls, it does keep track of the number of requests issued by each visitor, and will crawl no more than 400 pages for a given visitor on a given day.
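For the curious, here is a minimal Python sketch of that crawl policy: follow links within the host, discard page content as soon as links are extracted, throttle to one request every 1.5 seconds, and stop at 400 pages. It is illustrative only, not the spider's actual code:

    import time
    import urllib.parse
    import urllib.request
    from html.parser import HTMLParser

    MAX_PAGES = 400      # the daily per-visitor cap described above
    DELAY_SECONDS = 1.5  # one request every 1.5 seconds

    class LinkParser(HTMLParser):
        """Collect href targets; the page body itself is never stored."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.links.extend(v for k, v in attrs if k == "href" and v)

    def crawl(start_url):
        """Follow same-host links without indexing content: FOLLOW, NOINDEX."""
        host = urllib.parse.urlparse(start_url).netloc
        seen, queue, crawled = {start_url}, [start_url], 0
        while queue and crawled < MAX_PAGES:
            url = queue.pop(0)
            crawled += 1
            try:
                with urllib.request.urlopen(url, timeout=30) as response:
                    html = response.read().decode("utf-8", errors="replace")
            except OSError:
                continue
            parser = LinkParser()
            parser.feed(html)  # extract the links, then let the content go
            for href in parser.links:
                absolute = urllib.parse.urljoin(url, href)
                if (urllib.parse.urlparse(absolute).netloc == host
                        and absolute not in seen):
                    seen.add(absolute)
                    queue.append(absolute)
            yield url
            time.sleep(DELAY_SECONDS)  # leisurely pace, per the text above

    for page in crawl("https://example.com/"):
        print(page)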

Interested in Web Development?

Try our other tools, like the Site Navigability Analyzer, which lets you see what a spider sees: it can analyze your anchor text diversity and find the length of the shortest path to any page. The Thin Content Checker can analyze your site's content, report the percentage of unique phrases per page, and generate a histogram of page content lengths.
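To make "length of the shortest path" concrete, here is a small breadth-first-search sketch over a hypothetical link graph; the Site Navigability Analyzer's actual implementation isn't published, so this shows only the underlying idea:

    from collections import deque

    def shortest_path(links, start, target):
        """Return the fewest-clicks path from `start` to `target`, or None."""
        queue = deque([[start]])
        seen = {start}
        while queue:
            path = queue.popleft()
            if path[-1] == target:
                return path
            for nxt in links.get(path[-1], []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(path + [nxt])
        return None  # target unreachable from start

    # Hypothetical site graph: the homepage links to /blog and /about,
    # and /blog links to an individual post.
    links = {
        "/": ["/blog", "/about"],
        "/blog": ["/blog/post-1"],
    }
    print(shortest_path(links, "/", "/blog/post-1"))
    # ['/', '/blog', '/blog/post-1'], i.e. two clicks from the homepage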
