
About the Website Readability Analyzer

What is Readability?
Readability refers to the ease with which a passage of written text can be understood. It is often used to assess the suitability of a text for an audience. Some states even require that legal documents and health care documents meet strict readability thresholds in order to be accessible to a wide audience. If you want to get your ideas across to the largest possible audience, it is worth spending some time thinking about readability.

Readability metrics, such as the Flesch-Kincaid and Gunning Fog index, are algorithmic heuristics for estimating readability. Many work by counting words, sentences and syllables, while others rely on lists of already scored words.
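To make the counting approach concrete, here is a minimal sketch of the Flesch Reading Ease formula. The function names and the vowel-group syllable heuristic are illustrative assumptions, not how this tool is implemented; production analyzers typically use pronunciation dictionaries for accurate syllable counts.

```python
import re

def count_syllables(word):
    # Naive heuristic: count groups of consecutive vowels,
    # dropping a trailing silent "e". Real analyzers use dictionaries.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_reading_ease(text):
    # FRE = 206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words)
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(len(words), 1)
    n_syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (n_syllables / n_words)
```

Higher scores mean easier text; short sentences of one-syllable words score above 100, while dense academic prose can score below 30.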

Keep in mind that readability is not a measure of writing quality, and that these heuristics only estimate a passage's readability.

This Readability Analyzer estimates the readability of a passage of text using the Flesch-Kincaid Reading Ease, Gunning Fog Index, Kincaid Grade Level, SMOG formula, Dale–Chall Score and Fry Reading Graph metrics. Which one is right for you? That depends partly on your domain and writing style. Different readability metrics flag difficult words in different ways. For example, the Fog Index considers words with three or more syllables difficult, whereas Dale–Chall relies on a list of easily recognized words. We suggest running a few different samples of text and going with the metric that most closely aligns with human evaluations.
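The difference in how metrics flag difficult words can be sketched with the Gunning Fog Index, which scores a text by sentence length plus the share of "complex" (three-or-more-syllable) words. The syllable heuristic below is a simplification assumed for illustration:

```python
import re

def rough_syllables(word):
    # Rough vowel-group count; good enough to flag "difficult" words.
    return max(len(re.findall(r"[aeiouy]+", word.lower())), 1)

def gunning_fog(text):
    # Fog = 0.4 * (avg sentence length + percentage of complex words)
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(len(words), 1)
    n_complex = sum(1 for w in words if rough_syllables(w) >= 3)
    return 0.4 * (n_words / sentences + 100 * n_complex / n_words)
```

The result approximates the US grade level needed to read the text on a first pass, which is why the same passage can score quite differently under Dale–Chall, where difficulty comes from a word list rather than syllable counts.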

Readability on the Web
Why is readability important on the web? Visitors will often leave a website within the first 10-20 seconds. As a website owner, you have a very short window to convince someone your site is worth their time.

Readability can be thought of as a measure of how much energy someone must spend to read your site: the more difficult the text, the more energy reading it takes, and the fewer words a visitor will absorb within that first 10-20 seconds.

What readability level should you aim for? That depends on what kind of website you have and what audience you want to attract. You'll want to aim for something that is accessible to the majority of your audience, not just the average visitor. Keep in mind that the average US adult reads at or below an 8th grade level.

About the Spider
In order for this tool to work, we must crawl the site or page you want analyzed. We do this with DatayzeBot, the datayze spider.

Our spider crawls at a leisurely rate of 1 page every 1.5 seconds. While the spider doesn't keep track of the contents of the pages it crawls, it does keep track of the number of requests issued by each visitor. Currently the crawler is limited to 1,000 pages per user per day. Since DatayzeBot does not index or cache any pages it crawls, rerunning the Website Readability Analyzer will count against your daily allowed number of page crawls. You can work around the cap by pausing the crawler and resuming it another day.
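DatayzeBot's internals aren't published, but the crawl policy described above (a fixed delay between requests plus a per-day budget) can be sketched as a simple rate limiter. The class and method names here are hypothetical:

```python
import time

class PoliteCrawler:
    """Illustrative sketch: one request per `delay` seconds, capped per day."""

    def __init__(self, delay=1.5, daily_cap=1000):
        self.delay = delay
        self.daily_cap = daily_cap
        self.requests_today = 0
        self.last_request = 0.0

    def acquire(self):
        # Refuse once the daily budget is spent; the caller can pause
        # and resume another day, as the tool allows.
        if self.requests_today >= self.daily_cap:
            raise RuntimeError("daily page budget exhausted; resume another day")
        # Sleep just long enough to keep the leisurely pace.
        wait = self.delay - (time.monotonic() - self.last_request)
        if wait > 0:
            time.sleep(wait)
        self.last_request = time.monotonic()
        self.requests_today += 1
```

A crawler would call `acquire()` before each page fetch, so the fetch loop never exceeds either the pace or the cap.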

DatayzeBot now respects the Robots Exclusion Standard. To specifically allow (or disallow) the crawler to access a page or directory, create a new set of rules for "DatayzeBot" in your robots.txt file. DatayzeBot will follow the longest matching rule for a given page, rather than the first matching rule. If no matching rule is found, DatayzeBot assumes it is allowed to crawl the page. Not sure if a page is excluded by your robots.txt file? The Index/No Index app will parse HTML headers, meta tags and robots.txt and summarize the results for you.
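The longest-match behavior described above can be sketched as follows. This is an illustration of the matching rule only, not DatayzeBot's actual parser; the `rules` format is an assumption representing one robots.txt group:

```python
def is_allowed(path, rules):
    """Pick the longest matching rule for `path`; default to allow.

    `rules` is a list of (directive, prefix) pairs from a "DatayzeBot"
    group, e.g. [("Disallow", "/private/"), ("Allow", "/private/faq.html")].
    """
    best_directive = None
    best_length = -1
    for directive, prefix in rules:
        # A rule matches when its path prefix matches; the longest
        # (most specific) match wins, regardless of rule order.
        if prefix and path.startswith(prefix) and len(prefix) > best_length:
            best_directive = directive
            best_length = len(prefix)
    # No matching rule means the page may be crawled.
    return best_directive != "Disallow"
```

Under this scheme an `Allow` for a specific file overrides a shorter `Disallow` for its parent directory, no matter which line comes first in the file.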