Checklist - WebSite Auditor
Feature | Explanation | Does WebSite Auditor support the feature?
Basic SEO reports
List of indexable/non-indexable pages
It's necessary to view a list of indexable/non-indexable pages to make sure there are no mistakes. Maybe some URLs were intended to be indexable?
Yes
Missing title tags
Meta titles are an important part of SEO audits. A crawler should show you a list of pages that have missing tags.
Yes. Go to "Site audit" -> "Empty title tags"
Filtering URLs by status code (3xx, 4xx, 5xx)
When you perform an SEO audit, it's necessary to filter URLs by status code. How many URLs are not found (404)? How many are redirected (301)?
Yes
List of Hx tags
“Google looks at the Hx headers to understand the structure of the text on a page better.” - John Mueller
You can view the list of Hx tags only in a detailed report for a given URL.
View internal nofollow links
It's useful to see the list of internal nofollow links to make sure there aren't any mistakes.
Yes. Go to Site structure -> "Pages" -> "Links & technical factors". Review the "Links to page" section.
External links list (outbound external)
A crawler should allow you to analyze both internal and external outbound links.
Yes. Go to "Site Audit" -> "Dofollow external links"
Link rel="next" (to indicate a pagination series)
When you perform an SEO audit, you should check whether pagination series are implemented properly.
No
Hreflang tags
Hreflang tags are the foundation of international SEO, so a crawler should recognize them to let you spot hreflang-related issues.
No (however, you can do custom extraction)
Canonical tags
Every SEO crawler should inform you about canonical tags to let you spot indexing issues.
Yes. Go to Site structure -> Site Audit -> Pages with rel="canonical"
Information about crawl depth - the number of clicks from the homepage
Additional information about crawl depth can give you an overview of the structure of your website. If an important page isn't accessible within a few clicks from the homepage, it may indicate poor website structure.
Yes
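Click depth is just a breadth-first search over the internal link graph. A minimal sketch of the idea (the link graph here is hypothetical sample data, not anything WebSite Auditor exposes):

```python
from collections import deque

def click_depth(links, start="/"):
    """Breadth-first search over an internal link graph: the number of
    clicks needed to reach each page from the homepage."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Hypothetical three-level site
links = {"/": ["/category"], "/category": ["/category/item"], "/category/item": ["/"]}
print(click_depth(links))  # {'/': 0, '/category': 1, '/category/item': 2}
```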
Content analysis
List of empty/thin pages
A large number of thin pages can negatively affect your SEO efforts. A crawler should report them.
Yes, based on the word count. Go to "Site structure" -> Pages and add a new column: "Word count".
Duplicate content recognition
A crawler should give you at least basic information on duplicates across your website.
Yes. Click on Site structure -> Site audit. Then review the following sections: "Duplicate titles" and "duplicate meta descriptions"
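This kind of basic duplicate detection can be approximated by grouping crawled URLs by normalized title; a sketch with hypothetical sample data:

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group URLs by their <title> text and return only the groups that
    share a title. `pages` maps URL -> title (made-up sample data)."""
    groups = defaultdict(list)
    for url, title in pages.items():
        groups[title.strip().lower()].append(url)
    return {t: urls for t, urls in groups.items() if len(urls) > 1}

pages = {
    "/red-shoes": "Buy Shoes Online",
    "/blue-shoes": "Buy Shoes Online",
    "/about": "About Us",
}
print(find_duplicate_titles(pages))  # {'buy shoes online': ['/red-shoes', '/blue-shoes']}
```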
Convenience
A detailed report for a given URL
It's a must-have! If you crawl a website, you may want to see the internal links pointing to a particular URL, its headers, canonical tags, etc.
Yes
Advanced URL filtering for reporting - using regular expressions and modifiers like "contains," "starts with," "ends with."

I can't imagine my SEO life without a feature like this. It’s common that I need to see only URLs that end with “.html” or those which contain a product ID. A crawler must allow for such filtering.
Yes (but it doesn't support regular expressions)
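The "ends with .html" and "contains a product ID" filters mentioned above map directly onto regular expressions; an illustrative sketch (the URL list is made up):

```python
import re

def filter_urls(urls, pattern):
    """Return only the URLs matching the given regular expression."""
    rx = re.compile(pattern)
    return [u for u in urls if rx.search(u)]

urls = ["/p/123.html", "/blog/post", "/p/456.html?ref=nav"]
print(filter_urls(urls, r"\.html$"))  # ends with ".html": ['/p/123.html']
print(filter_urls(urls, r"/p/\d+"))   # contains a numeric product ID
```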
Adding additional columns to a report
This is also a very important feature of crawlers. I simply can't live without it. When I view a single report, I want to add additional columns to get the most out of the data. Fortunately, most crawlers allow this.
Yes
Page categorizing
Some crawlers offer the possibility to categorize crawled pages (e.g. blog, product pages, etc.) and see reports dedicated to specific categories of pages.
No
Filtering URLs by type (HTML, CSS, JS, PDF, etc.)
Crawlers visit resources of various types (HTML, PDF, JPG), but usually you want to review only HTML files. A crawler should support this.
Yes. Go to "All resources"
Basic statistics about website structure - i.e. depth stats
Yes. Go to Site Structure -> Visualization section
Overview - a list of all detected issues on a single dashboard
It's a positive if a crawler lists all the detected issues on a single dashboard. Of course, it will not do the job for you, but it can make SEO audits easier and more efficient.
Yes. Go to Site structure -> Site audit
Comparing to the previous crawl
When you work on a website for a long time, it’s important to compare the crawls that were done before and after the changes.
No
Crawl settings
List mode - crawl just the listed URLs (helpful for a website migration)
Sometimes you want to perform a quick audit of a specified set of URLs without crawling the whole website.
Yes
Changing the user agent
Sometimes it's necessary to change the user agent. For example, a website may block a crawler like Ahrefs while you still need to perform a crawl. Also, more and more websites detect Googlebot by user agent and serve it a pre-rendered version instead of the full JavaScript version.
Yes
Crawl speed adjusting
You should be able to set the crawl speed, e.g. 1-3 URLs per second if a website can't handle the load, while you may want to crawl much faster if a website is healthy.
Yes
Can I limit crawling? Crawl depth, max number of URLs
Many websites have millions of URLs. Sometimes it's good to limit the crawl depth or specify a maximum number of URLs to crawl.
You can only specify the crawl depth; you can't limit the number of URLs to be crawled
Analyzing a domain protected by an .htaccess login
This is a helpful feature if you want to crawl a staging website.
Yes
Can I exclude particular subdomains, or include only specific directories?
Yes, you can exclude/include URLs containing particular strings.
Universal crawl -> crawl + list mode + sitemap
No
Maintenance
Crawl scheduling
It's handy to be able to schedule a crawl and set monthly/weekly crawls.
Yes (Go to Preferences -> Scheduler)
Indicating the crawling progress
If you deal with big websites, you should be able to see the current status of a crawl. Will you wait a few hours, or a few weeks, until a 1M+ URL crawl finishes?
Yes
Robots.txt changes monitoring
Accidental changes in robots.txt can prevent Google from reading and indexing your content. It's beneficial if a crawler detects changes in robots.txt and informs you.
No
Crawl data retention
It’s good if a crawler can store results for a long period of time.
Forever. Crawl stats are saved in a project file and remain available forever, even if the subscription has expired.
Notifications - crawl finished
A crawler should inform you when a crawl is done (desktop notification / email).
Yes
Advanced SEO reports
List of pages with fewer than X incoming links
If there are no internal links pointing to a page, Google may consider the page irrelevant. It's crucial to spot orphan URLs.
Yes. Go to Site structure -> Pages and sort the results by "Links to page".
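Counting incoming links (and flagging pages with zero) is a simple aggregation over the crawl's link graph; an illustrative sketch with a made-up graph:

```python
from collections import Counter

def inlink_counts(links):
    """Count internal links pointing to each page. `links` maps a page to
    the pages it links out to; pages nobody links to are orphan candidates."""
    counts = Counter()
    for targets in links.values():
        counts.update(targets)
    all_pages = set(links) | set(counts)
    return {page: counts.get(page, 0) for page in all_pages}

# Hypothetical link graph: "/orphan" exists but receives no internal links
links = {"/": ["/a", "/b"], "/a": ["/b"], "/orphan": []}
counts = inlink_counts(links)
print(sorted(p for p, c in counts.items() if c == 0))  # ['/', '/orphan']
```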
Comparison of URLs found in sitemaps and in the crawl
Sitemaps should contain all the valuable URLs. If some pages are not included in a sitemap, it can cause issues with crawling and indexing by Google. If a URL appears in a sitemap but isn't reachable through the crawl, it may signal to Google that the page is not relevant.
Yes. You should enable the "Search for orphan pages" feature while setting a crawl.
Internal PageRank value
Although no PageRank calculation can exactly reflect Google's link graph, it's still a really important feature. Imagine you want to see the most important URLs based on links. Then you should sort URLs not only by simple metrics like the number of inlinks, but also by internal PageRank. You think Google doesn't use PageRank anymore? See http://www.seobythesea.com/2018/04/pagerank-updated/
Yes
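Internal PageRank can be approximated with plain power iteration over the internal link graph; this is a generic sketch of the algorithm, not the tool's implementation:

```python
def internal_pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank over an internal link graph.
    `links` maps each page to the list of pages it links to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        # pages with no outlinks ("dangling") spread their rank evenly
        dangling = sum(rank[p] for p in pages if not links.get(p))
        for p in pages:
            new[p] += damping * dangling / n
        for src, targets in links.items():
            for t in targets:
                new[t] += damping * rank[src] / len(targets)
        rank = new
    return rank

# Hypothetical internal link graph: every page links back to the homepage
links = {"/": ["/a", "/b"], "/a": ["/"], "/b": ["/"]}
ranks = internal_pagerank(links)
print(max(ranks, key=ranks.get))  # "/" gets the highest internal PageRank
```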
Mobile audit
With mobile-first indexing, it's necessary to perform a content parity audit between the mobile and desktop versions of your website.
You can crawl with the mobile user agent.
Additional SEO reports
Malformed URLs (https://https://, https://example.com/tag/someting/tag/tag/tag, or https://www.example.com/first_part of URL)
Partially. Go to Site Structure -> Pages and try the following filters: "Page contains space", "Page contains `https://https` ".
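Malformed URLs like these can also be caught with a few regular expressions; the patterns below are illustrative heuristics mirroring the examples above, not an exhaustive list:

```python
import re

MALFORMED_PATTERNS = [
    re.compile(r"https?://https?://"),  # doubled scheme, e.g. https://https://
    re.compile(r"\s"),                  # whitespace inside the URL
    re.compile(r"(/[^/]+)\1{2,}"),      # same path segment repeated 3+ times
]

def is_malformed(url):
    return any(rx.search(url) for rx in MALFORMED_PATTERNS)

urls = [
    "https://https://example.com/",
    "https://example.com/first_part of URL",
    "https://example.com/tag/tag/tag/x",
    "https://example.com/ok",
]
print([u for u in urls if is_malformed(u)])  # flags the first three URLs
```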
List of URLs with parameters
Yes. Go to Pages -> All pages -> add a new filter: contains "?"
Mixed content (some pages/resources are served via HTTPS, some via HTTP)
Yes. Go to Site Structure -> Site Audit -> "HTTP pages with mixed content issues"
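Mixed content boils down to an HTTPS page referencing plain-HTTP resources; a naive regex-based check for illustration (real HTML should go through a proper parser):

```python
import re

# naive: find src/href attributes that point at plain-HTTP resources
HTTP_RESOURCE = re.compile(r'(?:src|href)="(http://[^"]+)"')

def mixed_content(page_html):
    """Return plain-HTTP resources referenced from a page served over HTTPS."""
    return HTTP_RESOURCE.findall(page_html)

html = ('<img src="http://example.com/logo.png">'
        '<script src="https://example.com/app.js"></script>')
print(mixed_content(html))  # ['http://example.com/logo.png']
```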
Redirect chains report
Nobody likes redirect chains - neither users nor search engines. A crawler should report any redirect chains to let you decide if they're worth fixing.
Yes. Go to "Site structure" -> "Site audit" -> "Pages with long redirect chains". (Open question for the vendor: can I see the first and last URL in the chain?)
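A redirect chain is just repeated hops until a non-redirecting URL is reached; a sketch that follows a redirect map (sample data standing in for real 3xx responses):

```python
def redirect_chain(start, redirects, max_hops=10):
    """Follow a URL through a redirect map and return the full chain.
    `redirects` maps a URL to its redirect target (hypothetical data)."""
    chain = [start]
    seen = {start}
    while chain[-1] in redirects and len(chain) <= max_hops:
        nxt = redirects[chain[-1]]
        chain.append(nxt)
        if nxt in seen:  # redirect loop detected
            break
        seen.add(nxt)
    return chain

redirects = {"/old": "/interim", "/interim": "/new"}
print(redirect_chain("/old", redirects))  # ['/old', '/interim', '/new']
```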
Website speed statistics
Performance is becoming more and more important for both users and SEO, so crawlers should present performance-related reports.
Yes
List of URLs blocked by robots.txt
It happens that a webmaster mistakenly prevents Google from crawling a particular set of pages. As an SEO, you should review the list of URLs blocked by robots.txt to make sure there are no mistakes.
Yes. Go to Site Structure -> Site Audit -> Resources restricted from indexing and filter the results by "Robots instructions: `Disallowed (Robots.txt)`".
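The same check can be done by hand with Python's standard-library robots.txt parser; the disallow rule below is a made-up example:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; in practice you'd fetch /robots.txt
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

crawled = ["/", "/private/report", "/blog/post"]
blocked = [url for url in crawled if not rp.can_fetch("*", url)]
print(blocked)  # ['/private/report']
```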


Schema.org detection
There is just basic information on whether schema.org was implemented or not. http://take.ms/Bx3sA
Export, sharing
Exporting to Excel / CSV
Sometimes a crawler has no power here, and you need to export the data and edit it in Excel or other tools.
Yes
Exporting to PDF
Yes
Custom reports / dashboards

Yes. Go to the "Reports" tab
Sharing individual reports
Imagine that you want to share a report related to 404s with your developers. Does the crawler support it?
Yes. You can customize your report and share it.
Granting access to a crawl for another person
It's pretty common that two or more people work on the same SEO audit. Thanks to report sharing, you can work simultaneously.
No. You can export a crawl and your colleagues can open it in their copy of WebSite Auditor; you can also export a crawl summary and get a shareable link to it.
Miscellaneous
Explanations of the issues
If you are new to SEO, you will appreciate the explanation of the issues that many crawlers provide.
Yes
Custom extraction
A crawler should let you perform a custom extraction to enrich your crawl. For instance, while auditing an e-commerce website, you should be able to scrape information about product availability and price.
Yes
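Custom extraction usually means pulling a value out of the HTML by selector; a minimal stdlib sketch that grabs a price by class name (the class name and markup are hypothetical):

```python
from html.parser import HTMLParser

class PriceExtractor(HTMLParser):
    """Grab the text of the first element whose class list contains 'price'.
    The 'price' class is an assumption; real sites need their own selector."""
    def __init__(self):
        super().__init__()
        self._in_price = False
        self.price = None

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class", "")
        if self.price is None and "price" in classes.split():
            self._in_price = True

    def handle_data(self, data):
        if self._in_price:
            self.price = data.strip()
            self._in_price = False

html = '<div class="product"><span class="price">$49.99</span></div>'
parser = PriceExtractor()
parser.feed(html)
print(parser.price)  # $49.99
```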
Can the crawler detect the unique part of a page - the part that is not part of the template?
It's valuable if a crawler lets you analyze only the unique part of a page (excluding navigation links, sidebars, and the footer).
No
Ability to use the crawler's API
No
Supported operating systems
Windows, Linux, Mac
Integration
Integration with Google Analytics
Yes
Integration with Google Search Console
Yes
Integration with server logs
No
Integration with other tools
Integration with Moz API
JavaScript rendering
JavaScript is more and more popular. If your website depends heavily on JavaScript, it's a good idea to use a crawler that supports JS rendering.
Yes
Why should users use WebSite Auditor?
"There's quite a number of powerful web crawlers on the market, and we're happy to be in this competitive and highly professional space, which constantly inspires us to strive for excellence.

We've analyzed the feedback of our loyal customers to identify several common points, which make the SEO professionals choose WebSite Auditor over other web crawlers:

- Loads of conveniently organized crawl data. WebSite Auditor provides abundant data for technical analysis: crawlability and indexing, redirects, code, internal linking, images, mobile friendliness, and more. Most importantly, this data is logically clustered and visually rich. In comparison with some other popular tools, there’s no need to switch between tabs and columns to collect the required data and further accumulate it into a spreadsheet. With WebSite Auditor, you have it all in one place, ready to make informed SEO decisions.

- Dedicated content analysis. WebSite Auditor lets users optimize title tags, meta descriptions, headings, alt tags, and more. Plus, all the keyword-based calculations are built on the advanced TF-IDF (term frequency - inverse document frequency) algorithm.

- Visual website structure maps. Beautiful, interactive site visualizations help WebSite Auditor users to spot problems in their site architecture, analyze internal link juice, and explain errors to clients.

- Customizable, good-looking reports. While all the tools on the market (including WebSite Auditor) will serve you excellent CSV files with the necessary data, WebSite Auditor's reports are the choice of many SEO professionals as they make the SEO data speak for itself. According to the feedback we get, WebSite Auditor reports are way better to impress clients and teammates, as they're vocal, beautiful and fully customizable.

- Agency work: easy price setting for new clients. We hear it often from agency folks that WebSite Auditor works great when new clients come over, and the sales person needs to quickly evaluate the amount of work ahead and provide a quote. The analysis is quick and truly gives the big picture about the site's technical and on-page faults, and what needs to be fixed".
Free account / trial
Yes, you can get a 14-day, fully featured trial.
Referral link
WebSite Auditor Enterprise 10% off checkout
WebSite Auditor Professional 10% off checkout