Getting started
Tasks (How do I...?)
- find and display my site's broken links
- locate a broken link
- limit my crawl using blacklisting / whitelisting
- export an XML sitemap
- FTP the XML sitemap to my server
- use canonical href to exclude duplicates from my XML sitemap
- find missing meta tags
- find duplicate content (same content, different URL)
- test the load speed of a page and all of its elements
- find the element that's slowing a page down
- analyse my pages for occurrences of a chosen keyword / phrase
- test the HTML validation of a page or all pages
- test a website which requires authentication
- run Scrutiny on a schedule
Reference
- Limitations
- Settings
- Blacklists / Whitelists (Do not check / Only follow / Do not follow)
- Number of threads
- Archive pages while crawling
- Timeout and Delay
- Ignore querystrings
- Don't follow 'nofollow'
- Pages have unique titles
- Preferences
- Limiting Crawl
- Location of Validator (Scrutiny feature)
- User agent string / spoofing
- Ignore leading / trailing slashes and mismatched quotes around URLs
- Check for robots.txt and robots meta tag (Scrutiny feature)
- Advanced settings
- Authentication (Scrutiny feature)
- Custom header fields
- Checking local files
- Importing a list of links
- Sitemap
- SEO and page analysis
- Keyword / phrase analysis
- Analysing the load of all elements of a page
- HTML Validation (Scrutiny feature)
- Using the public instance (validator.w3.org)
- Installing and using your own instance of the w3c validator (free)
- Export options and the full report
- Running on a schedule
- FAQs
- More information and download - Scrutiny
- More information and download - Integrity