Technical Audit Checklist (for Human Beings)
Legend: each check carries a grade (Pass / OK / Fail) and a type (Outcome, Cause, or Issue). Issues are nested under the Cause they point to, and Causes under their Outcome. The "Where" line combines the tool and the report or field to start from; "Ref" links supporting documentation.

OUTCOME [Fail]: There is a technical reason good content isn't indexed.
CAUSE [Fail]: URLs are not discoverable by crawlers.

  [Pass] XML sitemaps aren't uploaded to GWT.
         Where: Google Search Console, Crawl > Sitemaps
         Ref: https://support.google.com/webmasters/answer/183668?hl=en&ref_topic=4581190

  [OK]   XML sitemaps don't reflect the valid URLs on the site.
         Where: Google Search Console, Crawl > Sitemaps
         Ref: https://support.google.com/webmasters/answer/183669?hl=en&ref_topic=4581190

  [Fail] Internal navigation breaks with basic JavaScript rendering capability.
         Where: Screaming Frog Crawl, Configuration > Spider > Rendering

  [Pass] There are more than ~300 links on important pages.
         Where: Screaming Frog Crawl, Outlinks field
         Ref: https://www.youtube.com/watch?v=QHG6BkmzDEM

  [Pass] Important content is >4 clicks from the homepage.
         Where: Screaming Frog Crawl, Inlinks field
         Ref: https://googlewebmastercentral.blogspot.com/2008/10/importance-of-link-architecture.html

  [Pass] Robots.txt blocks content we want in the index.
         Where: Google Search Console, Crawl > robots.txt Tester
         Ref: https://support.google.com/webmasters/answer/6062608?hl=en

  [Pass] The website is timing out.
         Where: Screaming Frog Crawl, Status code

  [Pass] The site is down.
         Where: Screaming Frog Crawl, Status code
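The ">4 clicks from the homepage" check can be approximated offline with a breadth-first search over exported internal links; a minimal sketch, where the link graph is a hypothetical stand-in for real crawl data:

```python
from collections import deque

def click_depths(link_graph, homepage):
    """BFS over internal links; returns {url: minimum clicks from homepage}."""
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue:
        url = queue.popleft()
        for target in link_graph.get(url, []):
            if target not in depths:
                depths[target] = depths[url] + 1
                queue.append(target)
    return depths

# Hypothetical crawl export: page -> internal outlinks
graph = {
    "/": ["/products", "/blog"],
    "/products": ["/products/widgets"],
    "/products/widgets": ["/products/widgets/blue"],
    "/products/widgets/blue": ["/products/widgets/blue/reviews"],
}
depths = click_depths(graph, "/")
deep_pages = [u for u, d in depths.items() if d > 4]
```

Pages missing from the result are unreachable from the homepage at all, which is the discoverability failure this cause describes.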
CAUSE [Fail]: Bad URLs are being presented to crawlers as good.

  [Pass] Error pages return 200 status codes.
         Where: Screaming Frog Crawl, Title tags (look for "404" or "Error" on a page that returns 200)
         Ref: https://support.google.com/webmasters/answer/93641?hl=en&ref_topic=6001951

  [Pass] Internal links point to URLs returning 4XX or 5XX status codes.
         Where: DeepCrawl, Summary > All Pages > HTTP Status Breakdown

  [Pass] Robots.txt doesn't block URLs that don't belong in the index.
         Where: Google Search Console, Crawl > robots.txt Tester
         Ref: https://support.google.com/webmasters/answer/6062608?hl=en

  [OK]   Sitemaps contain valid URLs we want to keep out of the index.
         Where: Screaming Frog Sitemap Crawl

  [Fail] Sitemaps contain invalid URLs.
         Where: Screaming Frog Sitemap Crawl
         Ref: https://support.google.com/webmasters/answer/183669?hl=en&ref_topic=4581190
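The soft-404 check ("404" or "Error" in the title of a page returning 200) and the broken-internal-link check can both be screened from any crawl export; a sketch, assuming a hypothetical {url: (status, title)} export:

```python
def audit_crawl(pages):
    """pages: {url: (status_code, title)} from a crawl export.
    Flags soft 404s (error pages returning 200) and broken URLs."""
    soft_404s, broken = [], []
    for url, (status, title) in pages.items():
        if status == 200 and any(t in title.lower() for t in ("404", "error")):
            soft_404s.append(url)   # looks like an error page but returns 200
        elif 400 <= status < 600:
            broken.append(url)      # 4XX/5XX reachable from internal links
    return soft_404s, broken

# Hypothetical crawl export
pages = {
    "/good": (200, "Widgets for sale"),
    "/gone": (200, "404 - Page Not Found"),   # soft 404
    "/dead": (404, "Not Found"),
    "/boom": (500, "Server Error"),
}
soft, broken = audit_crawl(pages)
```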
CAUSE [Fail]: Duplication is causing Google to ignore pages.

  [Pass] Canonical tags don't associate duplicate content.
         Where: DeepCrawl, Content > Body Content > Duplicate Body Sets
         Ref: https://support.google.com/webmasters/answer/139066?hl=en

  [Pass] URLs work with both HTTP and HTTPS.
         Where: Screaming Frog Crawl, URL field
         Ref: https://support.google.com/webmasters/answer/6073543?hl=en

  [OK]   Duplicate content shows up on other domains or subdomains.
         Where: DeepCrawl, Content > Body Content > Duplicate Body Sets
         Ref: https://support.google.com/webmasters/answer/66359?hl=en

  [Fail] Multiple URL patterns return the same content.
         Where: DeepCrawl, Content > Body Content > Duplicate Body Sets
         Ref: https://support.google.com/webmasters/answer/66359?hl=en

  [Pass] Mobile markup isn't implemented.
         Where: DeepCrawl, Mobile > Categorization > Separate Mobile
         Ref: https://developers.google.com/webmasters/mobile-sites/mobile-seo/separate-urls
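Spot-checking the canonical-tag check without a full crawler: a small stdlib parser can pull rel="canonical" hrefs from saved page HTML, so duplicate sets can be verified to point at one URL. The sample document is hypothetical:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects rel="canonical" hrefs from a page's markup."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonicals.append(a.get("href"))

def canonicals_of(html):
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonicals

# Hypothetical page source
html_doc = '<head><link rel="canonical" href="https://example.com/widgets"></head>'
```

More than one canonical on a page, or zero across a duplicate set, are both findings worth recording.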
CAUSE [Fail]: Our site serves too many unique pages.

  [Pass] We have paginated content without using rel="next".
         Where: DeepCrawl, Indexation > Indexable Pages > Paginated 2+ Pages
         Ref: https://support.google.com/webmasters/answer/1663744?hl=en

  [OK]   Faceted navigation results in an unbounded amount of content.
         Where: DeepCrawl, Summary > Dashboard > Web Crawl Depth
         Ref: https://googlewebmastercentral.blogspot.com/2014/02/faceted-navigation-best-and-5-of-worst.html

  [Fail] We haven't specified parameter behavior in GWT.
         Where: Google Search Console, Crawl > URL Parameters
         Ref: https://support.google.com/webmasters/answer/6080550?hl=en
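One way to gauge how badly facets and tracking parameters multiply URLs: normalize away parameters that don't change content and count what remains. The IGNORED_PARAMS set below is a hypothetical example; the real list comes from your own parameter audit:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Hypothetical: parameters that never change what the page shows
IGNORED_PARAMS = {"utm_source", "utm_medium", "sessionid", "sort"}

def normalize(url):
    """Drop content-neutral parameters and sort the rest, so faceted and
    tracking variants of one page collapse to a single canonical form."""
    parts = urlsplit(url)
    params = sorted((k, v) for k, v in parse_qsl(parts.query)
                    if k not in IGNORED_PARAMS)
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(params), ""))

urls = [
    "https://example.com/shoes?color=red&sort=price",
    "https://example.com/shoes?sort=name&color=red",
    "https://example.com/shoes?color=red&utm_source=mail",
]
unique_pages = {normalize(u) for u in urls}
```

Comparing the raw URL count with the normalized count gives a quick estimate of how much crawl budget the parameters are wasting.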
CAUSE [Pass]: On-page content is not readable by crawlers.

  [Pass] Page copy isn't visible with basic JavaScript rendering capability.
         Where: Screaming Frog Crawl, Configuration > Spider > Rendering
         Ref: https://developers.google.com/search/docs/guides/debug-rendering

  [Pass] We load crucial content in an iframe.
         Where: Screaming Frog Crawl, Custom filter for "<iframe"
         Ref: https://support.google.com/webmasters/answer/34445?hl=en

  [Pass] We load crucial content in Flash.
         Where: Manual Page Testing
         Ref: https://support.google.com/webmasters/answer/72746?hl=en#1

  [Pass] We serve different content to different user agents, including crawlers.
         Where: Screaming Frog Crawl, Configuration > HTTP Header > User-Agent
         Ref: https://support.google.com/webmasters/answer/66355?hl=en

  [Pass] Mobile URLs don't resolve when accessed with various user agents.
         Where: Screaming Frog Crawl, Configuration > HTTP Header > User-Agent
         Ref: https://developers.google.com/webmasters/mobile-sites/mobile-seo/
CAUSE [Pass]: We haven't indicated whether we want content discovered or indexed.

  [Pass] We don't have nofollow tags on links pointing to non-indexable content.
         Where: Screaming Frog Crawl, Configuration > Spider > Basic (uncheck Follow Internal "nofollow")
         Ref: https://www.youtube.com/watch?v=86GHCVRReJs

  [Pass] We have noindex tags on content we want in the index.
         Where: Screaming Frog Crawl, Configuration > Spider > Basic (uncheck Follow Internal "nofollow")
         Ref: https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag?csw=1
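To audit noindex/nofollow signals in bulk, the meta robots directives can be pulled from stored HTML and compared against the list of pages you actually want indexed. A simplified sketch (the regex assumes the name attribute precedes content, which real-world markup may not honor):

```python
import re

def robots_directives(html):
    """Extract directives from <meta name="robots" content="..."> tags.
    Simplified: assumes name comes before content in the tag."""
    directives = set()
    pattern = r'<meta\s+name=["\']robots["\']\s+content=["\']([^"\']+)["\']'
    for m in re.finditer(pattern, html, re.I):
        directives.update(d.strip().lower() for d in m.group(1).split(","))
    return directives

# Hypothetical page source
page = '<head><meta name="robots" content="noindex, follow"></head>'
```

A page meant to rank that comes back with "noindex" here is exactly the failure this check describes.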
OUTCOME [Fail]: There is a technical reason indexed content doesn't rank for desired terms.
CAUSE [Pass]: Our internal linking doesn't convey the relative importance of our content.

  [Pass] We use redirects that aren't 301s.
         Where: DeepCrawl, Indexation > Non-200 Status > Non-301 Redirects
         Ref: https://support.google.com/webmasters/answer/93633?hl=en&ref_topic=6001951

  [Pass] We use JavaScript redirects.
         Where: Manual Page Testing
         Ref: https://support.google.com/webmasters/answer/2721217?hl=en

  [Pass] We use meta refresh tags.
         Where: Screaming Frog Crawl, Meta Refresh field
         Ref: https://support.google.com/webmasters/answer/79812?hl=en

  [Pass] We have redirect chains.
         Where: Screaming Frog Crawl, Reports > Redirect Chains
         Ref: https://support.google.com/webmasters/answer/6033086?hl=en

  [Pass] We link more to inconsequential content than important organic content.
         Where: Screaming Frog Crawl, Inlinks field
         Ref: https://googlewebmastercentral.blogspot.com/2008/10/importance-of-link-architecture.html
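Redirect chains can also be detected straight from a crawl export that maps each redirecting URL to its target; the mapping below is hypothetical:

```python
def redirect_chains(redirects, max_hops=10):
    """redirects: {source_url: target_url} from a crawl export.
    Returns every chain of two or more hops (A -> B -> C ...)."""
    chains = []
    for start in redirects:
        chain = [start]
        while chain[-1] in redirects and len(chain) <= max_hops:
            nxt = redirects[chain[-1]]
            if nxt in chain:      # redirect loop; stop following
                break
            chain.append(nxt)
        if len(chain) > 2:        # more than one hop: a chain worth fixing
            chains.append(chain)
    return chains

# Hypothetical redirect map
redirects = {
    "/old": "/older",
    "/older": "/current",
    "/moved": "/final",
}
chains = redirect_chains(redirects)
```

The fix for each reported chain is to point every source directly at the final destination.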
CAUSE [Pass]: Our targeting elements don't help crawlers understand our content.

  [Pass] Title tags are longer than 60 characters.
         Where: DeepCrawl, Content > Titles & Descriptions > Max Title Length
         Ref: http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf

  [Pass] Title tags are duplicated across pages.
         Where: DeepCrawl, Content > Titles & Descriptions > Pages with Duplicate Titles
         Ref: http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf

  [Pass] Title tags are missing.
         Where: DeepCrawl, Content > Titles & Descriptions > Missing Titles
         Ref: http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf

  [Pass] H1 tags are duplicated across pages.
         Where: Screaming Frog Crawl, H1 field (sort by)
         Ref: http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf

  [Pass] H1 tags are missing.
         Where: DeepCrawl, Content > Body Content > Missing H1 Tags
         Ref: http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf

  [Pass] Pages have multiple H1 tags.
         Where: DeepCrawl, Content > Body Content > Multiple H1 Tag Pages
         Ref: http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf
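The title checks above (over 60 characters, duplicated, missing) reduce to a few passes over a {url: title} export; the same pattern works for H1s. A sketch with hypothetical pages:

```python
def audit_titles(pages, max_len=60):
    """pages: {url: title or None}. Flags missing, overlong, duplicate titles."""
    missing = [u for u, t in pages.items() if not t]
    too_long = [u for u, t in pages.items() if t and len(t) > max_len]
    seen = {}
    for url, title in pages.items():
        if title:
            seen.setdefault(title, []).append(url)
    duplicates = {t: urls for t, urls in seen.items() if len(urls) > 1}
    return missing, too_long, duplicates

# Hypothetical title export
pages = {
    "/a": "Blue Widgets",
    "/b": "Blue Widgets",
    "/c": None,
    "/d": "An extremely long title that keeps going well past the sixty character mark",
}
missing, too_long, duplicates = audit_titles(pages)
```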
CAUSE [Fail]: We haven't implemented bleeding-edge SEO best practices.

  [Fail] We don't use HTTPS.
         Where: Screaming Frog Crawl, URL field
         Ref: https://googlewebmastercentral.blogspot.com/2014/08/https-as-ranking-signal.html
CAUSE [Pass]: We are duplicating content that first appeared elsewhere.

  [OK]   Content is scraped from other sources.
         Where: Manual Page Testing, LMGTFY
         Ref: https://support.google.com/webmasters/answer/2721312?hl=en
CAUSE [Pass]: The site is slow enough that Google would prefer not to show it to searchers.

  [Pass] Our site takes longer than 5 seconds to load.
         Where: Chrome Inspector, Inspector > Network
         Ref: https://developers.google.com/speed/pagespeed/

  [Pass] We have excessive requests for external resources.
         Where: Chrome Inspector, Inspector > Network
         Ref: https://developers.google.com/speed/pagespeed/
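A rough way to quantify "excessive requests for external resources": take the resource URLs a page loads (e.g. copied from the Network panel) and count them by third-party host. The resource list here is hypothetical:

```python
from urllib.parse import urlsplit
from collections import Counter

def external_requests(resource_urls, site_host):
    """Count requests per third-party host from a page's resource list."""
    hosts = Counter()
    for u in resource_urls:
        host = urlsplit(u).netloc
        if host and host != site_host:
            hosts[host] += 1
    return hosts

# Hypothetical Network-panel export
resources = [
    "https://example.com/app.js",
    "https://cdn.tracker.io/pixel.gif",
    "https://cdn.tracker.io/tag.js",
    "https://fonts.example.net/font.woff2",
]
hosts = external_requests(resources, "example.com")
```

Hosts with many requests, or any host on the page's critical rendering path, are the first candidates to consolidate or drop.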
OUTCOME [Pass]: There is a technical reason site content isn't well-presented in search.
CAUSE [Pass]: We haven't indicated our preferred content.

  [Pass] Many pages have large overlaps in targeting.
         Where: Screaming Frog Crawl, Title field (sort by)
         Ref: https://support.google.com/webmasters/answer/66359?hl=en

  [Pass] We don't use canonical tags to indicate pages we want ranking.
         Where: Screaming Frog Crawl, Canonical field
         Ref: https://support.google.com/webmasters/answer/139066?hl=en

  [Pass] We haven't specified the canonical domain pattern in Webmaster Tools.
         Where: Google Search Console, Config > Site Settings
         Ref: https://support.google.com/webmasters/answer/44231?hl=en
CAUSE [Pass]: We aren't showing the relative importance of our content through internal linking.

  [Pass] We link more to inconsequential content than important organic content.
         Where: Screaming Frog Crawl, Inlinks field
         Ref: https://googlewebmastercentral.blogspot.com/2008/10/importance-of-link-architecture.html
CAUSE [Pass]: There are problems with our schema markup.

  [Pass] Our schema markup is missing or incomplete.
         Where: Google Search Console, Search Appearance > Structured Data
         Ref: https://developers.google.com/structured-data/policies

  [Pass] Our schema markup is spammy.
         Where: Google Search Console, Search Appearance > Structured Data
         Ref: https://support.google.com/webmasters/answer/3498001?hl=en
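A minimal sanity check for incomplete schema markup: parse each JSON-LD block and compare its properties against what its type requires. The REQUIRED table below is a hypothetical stand-in; the authoritative per-type requirements are in Google's structured data documentation:

```python
import json

# Hypothetical required-property table; consult Google's structured data
# documentation for the real per-type requirements.
REQUIRED = {
    "Product": {"name"},
    "Article": {"headline", "datePublished"},
}

def check_jsonld(raw):
    """Parse a JSON-LD block and report missing required properties."""
    data = json.loads(raw)
    required = REQUIRED.get(data.get("@type"), set())
    return sorted(required - set(data))

# Hypothetical JSON-LD block from a page
snippet = '{"@context": "https://schema.org", "@type": "Article", "headline": "Audit"}'
missing = check_jsonld(snippet)
```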
CAUSE [Pass]: We haven't signaled our international content.

  [Pass] We haven't implemented hreflang across localized sites.
         Where: DeepCrawl, Config > Hreflang > Pages without Hreflang Tags
         Ref: https://support.google.com/webmasters/answer/189077?hl=en

  [Pass] We haven't linked to all localized content.
         Where: Screaming Frog Crawl, URL field
         Ref: https://support.google.com/webmasters/answer/182192?hl=en&ref_topic=2370587

  [Pass] We haven't set GSC region targeting to reflect regions we're targeting.
         Where: Google Search Console, Search Traffic > International Targeting > Country
         Ref: https://support.google.com/webmasters/answer/62399?hl=en

  [Pass] We're using URL parameters to distinguish localized content.
         Where: Screaming Frog Crawl, URL field
         Ref: https://support.google.com/webmasters/answer/182192?hl=en&ref_topic=2370587
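Hreflang annotations must be reciprocal: every alternate a page names must name that page back. That return-tag check can be run from crawl data before Google reports the errors; a sketch over hypothetical annotations:

```python
def hreflang_errors(annotations):
    """annotations: {url: {lang: alternate_url}} from crawl data.
    Flags alternates that do not link back (missing return tags)."""
    errors = []
    for url, alts in annotations.items():
        for lang, alt in alts.items():
            if url not in annotations.get(alt, {}).values():
                errors.append((url, lang, alt))  # alternate does not link back
    return errors

# Hypothetical hreflang export: each page self-references and lists alternates
annotations = {
    "https://example.com/": {"en": "https://example.com/",
                             "de": "https://example.de/"},
    "https://example.de/": {"de": "https://example.de/",
                            "en": "https://example.com/"},
}
errors = hreflang_errors(annotations)
```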
CAUSE [Pass]: We haven't signaled our mobile content.

  [Pass] We're using dynamic serving but haven't implemented the Vary HTTP header.
         Where: DeepCrawl, Mobile > Categorization > Dynamically Served
         Ref: https://developers.google.com/webmasters/mobile-sites/mobile-seo/dynamic-serving
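Checking the Vary header for dynamic serving needs only the response headers: when content varies by user agent, the response should include "Vary: User-Agent" so caches and crawlers know to refetch per agent. A sketch, assuming headers captured as a plain dict:

```python
def vary_ok(response_headers):
    """True if the response declares Vary: User-Agent (possibly among
    other values), as dynamic serving requires."""
    vary = response_headers.get("Vary", "")
    return "user-agent" in [v.strip().lower() for v in vary.split(",")]

# Hypothetical captured response headers
good = {"Content-Type": "text/html", "Vary": "User-Agent, Accept-Encoding"}
bad = {"Content-Type": "text/html"}
```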