Improve your website's quality, SEO and user experience
- Faster than ever - v4 handles larger sites quickly and without slowing down
- Scan your site, checking for internal and external links as well as images
- Many configuration options; scans can be scheduled
- Test the load of a single page to find slow elements or measure total page weight
- See the uncompressed and compressed sizes of files, making the benefit of the server's gzip compression easy to see
- Generate an XML sitemap conforming to the standard sitemap protocol and upload it via FTP for submission to search engines
- Check your HTML using the W3C HTML validator, either the public instance or your own installation
- Highlight SEO (search engine optimisation) issues such as missing page titles, meta descriptions, headings and keywords. Now also highlights duplicates (different URL, same content)
- Many export options, including full report, sitemap XML and graphic visualisation
- Manage as many sites as you like with different settings for each
- See progress via dock icon
- Used by large corporations and individuals
- In short, improve your website's quality and search engine ranking
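For context on the sitemap feature above: the standard sitemap protocol is a simple XML format. A minimal sitemap file looks something like this (the URL and values are placeholders, not taken from Scrutiny's output):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; only <loc> is required -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2013-05-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```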
If you need a licence for the purposes of reviewing Scrutiny for a magazine or blog article, please contact me
Web download: Mac OS X 10.4 or higher, Intel or PPC. 10.8 (Mountain Lion) is supported, and the app is now code-signed to help keep 10.8's Gatekeeper happy.
Open the .dmg file and find an installer inside. To keep Scrutiny in your dock, right-click or click-and-hold on its dock icon and choose 'Keep in dock'.
"to say I am impressed is an understatement, really great tool I will be using a LOT"
- Peter B
"Thanks for the great tool, I think webmasters who don't need advanced options ... can make great use of it to spot main on page problems and remove obstacles to high rankings!"
- from review by Singley via Macupdate
"...it is what we were looking for and I believe it is the best link checker available for Mac OS."
- D H
- Integrity / Scrutiny Manual
- Integrity / Scrutiny FAQs
- Making best use of Scrutiny's SEO and keyword analysis
- Integrity / Scrutiny bug and enhancement list
- Scrutiny's home page
- The previous stable version, v4.0.4, is available here
If your question isn't answered, please use Scrutiny's support form
Developer: Shiela Dixon
v4.1.2 (Released May 13)
Improves authentication: allows you to input field names for websites which require login details to be sent by web form (e.g. WordPress sites)
Remembers the last-used filename and directory when saving the sitemap XML file; details are remembered for each of your sites
Ignores a 'bad SSL certificate' warning and continues, but only for the website being tested (anything else, i.e. external links, won't be followed anyway)
If a link's URL is just a hash, a hash character is displayed rather than the word 'hash', to avoid confusion
Improves speed of csv exports
v4.1 (Released April 13)
Now able to search for duplicates (same page with different urls)
Checks whether links are 'nofollow', displays this information in the link tables (switchable as per other columns) and adds option to prefs to 'not follow nofollow links'
Also checks the robots meta tag for nofollow. If present, the new 'don't follow nofollow links' option will also apply to links on that page
Adds a selector allowing a choice of highlighting in the SEO table: missing SEO parameters (as before), possible duplicates, or pages marked as nofollow
Double-click in SEO table now opens a new inspector showing SEO information including a list of possible duplicates
Scrutiny will check for the nofollow attribute - which is an overhead - if either of the relevant columns is showing (Preferences > Views), so that you can see which links are 'nofollow' even if you've chosen not to follow them. Hide both columns (the default global setting) if you don't need this information, and the crawl won't be slowed down
Fixes a problem that prevented page analysis from working properly
Checking for blacklist or whitelist terms is now case-insensitive, as you would probably expect
When flagging blacklisted URLs, the highlight colour used is now orange or the warning colour (previously red or the bad-link colour); a blacklisted URL is not an error, so an error colour was inappropriate
No longer includes 404 pages in the sitemap
Fixes problem of apparent duplicates in sitemap and SEO tables caused by two different link urls redirecting to the same url
Fixes bug preventing total image weight being shown in SEO table
More context help buttons
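For context on the 'nofollow' handling described in the v4.1 notes above: a link can be marked nofollow individually via its rel attribute, or every link on a page can be marked via the robots meta tag. Both standard forms look like this (the URL and link text are placeholders):

```html
<!-- Per-link: this one link carries rel="nofollow" -->
<a href="https://example.com/untrusted" rel="nofollow">untrusted link</a>

<!-- Page-wide: a robots meta tag in <head> applies nofollow to every link on the page -->
<meta name="robots" content="nofollow">
```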
v4.0.4 (Released February 13)
Fixes problems creating black/whitelist rules on first run with no settings saved
Correctly marks the window as edited (dirty dot in the red close button) when black/whitelist rules are changed, triggering a prompt to save when switching settings
v4.0.3 (Released February 13)
v4 (Release Candidate January 13)
Major improvements to the engine and data storage: even small sites will crawl more quickly, and large sites will crawl very much more quickly without slowing down or losing responsiveness
When the stop button is pressed, all open threads are abandoned, then recreated if 'continue' is pressed, giving a much better user experience
Blacklist and whitelist boxes replaced by a more user-friendly table of rules (existing data will be preserved and presented in the new way)
Adds a 'By page' links view. If 'bad links only' is showing, the view lists pages requiring attention, each expanding to show the bad links on that page
Routines for 'by page' view re-written to avoid apparent hanging at the end of the crawl of a big site
Adds new settings to Preferences allowing limits to be set (default 200,000 links). This offers the option of limiting the crawl of a large site (though that may be better achieved using blacklist/whitelist rules) and also acts as a safety valve to prevent crashes caused by running out of resources when crawling very large sites
If starting the crawl within a directory, the crawl is limited to that directory, i.e. it will go down the directory structure but not up. This matches users' expectations; previously, the crawl extended to all pages in the same domain
Fixes inefficiencies in full-report generation which gave the impression of 'hanging' when a full report was generated for a medium or large site
Fixes a problem with robots.txt when more than one user-agent is specified. Scrutiny now uses only the exclusion list for user-agent = all (*) or Google (i.e. it will respect the file as if it were Googlebot)
Moves 'check links on custom error pages' to settings rather than global preferences, and moves the 'labels' preferences to the View rather than General tab of the preferences window
Adds Help contents to help menu - links to manual index page
Increases the maximum number of threads from 30 to 40 (which will improve crawling for some sites), with the default now 12 rather than 7. The extreme left (labelled 'fewer') is still a single thread
Updated application icon
Resets the 30-day trial if you've used the trial with a previous version. There is a price increase, but existing licences will work with v4 - a thank you to those who bought in early
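For context on the robots.txt fix above: a robots.txt file can contain separate rule groups for different user-agents. Scrutiny applies only the exclusion rules for the wildcard agent or for Googlebot, so in a file like this (the paths are illustrative) the first two groups would be respected and the third ignored:

```
User-agent: *
Disallow: /private/

# Scrutiny respects the file as if it were Googlebot, so this group also applies
User-agent: Googlebot
Disallow: /drafts/

# Rules for other named agents are not applied by Scrutiny
User-agent: BadBot
Disallow: /
```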
Full version history for Scrutiny (the full history is also included in the release notes in the app's .dmg file)