Technical Audit Checklist (for Human Beings)
Each item below is graded Pass, OK, or Fail. Issues are grouped under the Cause they contribute to, and Causes under the Outcome they produce; the Where and Start here columns name the tool and the report to use when checking each issue.

Outcome (Fail): There is a technical reason good content isn't indexed.
Cause (Fail): URLs are not discoverable by crawlers.

| Issue | Grade | Where | Start here | Reference |
| --- | --- | --- | --- | --- |
| XML sitemaps aren't uploaded to GSC. | Pass | Google Search Console | Crawl > Sitemaps | https://support.google.com/webmasters/answer/183668?hl=en&ref_topic=4581190 |
| XML sitemaps don't reflect the valid URLs on the site. | OK | Google Search Console | Crawl > Sitemaps | https://support.google.com/webmasters/answer/183669?hl=en&ref_topic=4581190 |
| Internal navigation breaks with basic JavaScript rendering capability. | Fail | Screaming Frog Crawl | Configuration > Spider > Rendering | |
| There are more than ~300 links on important pages. | Pass | Screaming Frog Crawl | Outlinks field | https://www.youtube.com/watch?v=QHG6BkmzDEM |
| Important content is >4 clicks from the homepage. | Pass | Screaming Frog Crawl | Crawl depth field | https://googlewebmastercentral.blogspot.com/2008/10/importance-of-link-architecture.html |
| Robots.txt blocks content we want in the index. | Pass | Google Search Console | Crawl > robots.txt Tester | https://support.google.com/webmasters/answer/6062608?hl=en |
| The website is timing out. | Pass | Screaming Frog Crawl | Status code | |
| The site is down. | Pass | Screaming Frog Crawl | Status code | |
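The sitemap rows above are easy to spot-check outside the tools listed. Below is a minimal sketch that fetches a sitemap and verifies a sample of the listed URLs return 200; it assumes the third-party requests library is installed, and the example.com sitemap URL is hypothetical.

```python
# Sketch: fetch an XML sitemap and spot-check that the listed URLs respond.
import re
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical

resp = requests.get(SITEMAP_URL, timeout=10)
resp.raise_for_status()

# Naive <loc> extraction; fine for a spot check, not a full XML parser.
urls = re.findall(r"<loc>(.*?)</loc>", resp.text)

for url in urls[:20]:  # sample the first 20 to keep the check quick
    status = requests.head(url, timeout=10, allow_redirects=False).status_code
    if status != 200:
        print(f"{status}  {url}")
```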
Cause (Fail): Bad URLs are being presented to crawlers as good.

| Issue | Grade | Where | Start here | Reference |
| --- | --- | --- | --- | --- |
| Error pages return 200 status codes. | Pass | Screaming Frog Crawl | Title tags (look for "404" or "Error" on a page that returns 200) | https://support.google.com/webmasters/answer/93641?hl=en&ref_topic=6001951 |
| Internal links point to URLs returning 4XX or 5XX status codes. | Pass | DeepCrawl | Summary > All Pages > HTTP Status Breakdown | |
| Robots.txt doesn't block URLs that don't belong in the index. | Pass | Google Search Console | Crawl > robots.txt Tester | https://support.google.com/webmasters/answer/6062608?hl=en |
| Sitemaps contain valid URLs we want to keep out of the index. | OK | Screaming Frog Sitemap Crawl | | |
| Sitemaps contain invalid URLs. | Fail | Screaming Frog Sitemap Crawl | | https://support.google.com/webmasters/answer/183669?hl=en&ref_topic=4581190 |
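The soft-404 check in the first row above (an error page returning 200) can also be scripted. A minimal sketch, assuming the requests library; the URLs are hypothetical and the title heuristic is deliberately rough.

```python
# Sketch: flag likely soft 404s, i.e. pages whose title looks like an error
# page but which return a 200 status code.
import re
import requests

URLS = [
    "https://www.example.com/definitely-missing",  # hypothetical
    "https://www.example.com/",                    # hypothetical
]

ERROR_HINTS = re.compile(r"404|not found|error", re.IGNORECASE)

for url in URLS:
    resp = requests.get(url, timeout=10)
    match = re.search(r"<title[^>]*>(.*?)</title>", resp.text,
                      re.IGNORECASE | re.DOTALL)
    title = match.group(1).strip() if match else ""
    if resp.status_code == 200 and ERROR_HINTS.search(title):
        print(f"Possible soft 404: {url} (title: {title!r})")
```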
Cause (Fail): Duplication is causing Google to ignore pages.

| Issue | Grade | Where | Start here | Reference |
| --- | --- | --- | --- | --- |
| Canonical tags don't associate duplicate content. | Pass | DeepCrawl | Content > Body Content > Duplicate Body Sets | https://support.google.com/webmasters/answer/139066?hl=en |
| URLs work with both HTTP and HTTPS. | Pass | Screaming Frog Crawl | URL field | https://support.google.com/webmasters/answer/6073543?hl=en |
| Duplicate content shows up on other domains or subdomains. | OK | DeepCrawl | Content > Body Content > Duplicate Body Sets | https://support.google.com/webmasters/answer/66359?hl=en |
| Multiple URL patterns return the same content. | Fail | DeepCrawl | Content > Body Content > Duplicate Body Sets | https://support.google.com/webmasters/answer/66359?hl=en |
| Mobile markup isn't implemented. | Pass | DeepCrawl | Mobile > Categorization > Separate Mobile | https://developers.google.com/webmasters/mobile-sites/mobile-seo/separate-urls |
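For the HTTP/HTTPS row above, a quick probe is to request the same paths over both schemes and see whether both return 200 instead of one redirecting to the other. A minimal sketch; the domain and paths are hypothetical, and requests is assumed installed.

```python
# Sketch: check whether the same path resolves over both http:// and
# https:// without a redirect, which creates duplicate URLs.
import requests

DOMAIN = "www.example.com"  # hypothetical
PATHS = ["/", "/products/", "/about/"]  # hypothetical

for path in PATHS:
    for scheme in ("http", "https"):
        url = f"{scheme}://{DOMAIN}{path}"
        resp = requests.get(url, timeout=10, allow_redirects=False)
        # A 200 on both schemes means both versions are live duplicates;
        # ideally one scheme 301s to the other.
        print(resp.status_code, url)
```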
Cause (Fail): Our site serves too many unique pages.

| Issue | Grade | Where | Start here | Reference |
| --- | --- | --- | --- | --- |
| We have paginated content without using rel="next". | Pass | DeepCrawl | Indexation > Indexable Pages > Paginated 2+ Pages | https://support.google.com/webmasters/answer/1663744?hl=en |
| Faceted navigation results in an unbounded number of URLs. | OK | DeepCrawl | Summary > Dashboard > Web Crawl Depth | https://googlewebmastercentral.blogspot.com/2014/02/faceted-navigation-best-and-5-of-worst.html |
| We haven't specified parameter behavior in GSC. | Fail | Google Search Console | Crawl > URL Parameters | https://support.google.com/webmasters/answer/6080550?hl=en |
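A fast way to size the parameter problem is to strip query strings from a crawl export and count how many crawled URLs collapse to the same path. A minimal sketch; the URL list is hypothetical and would normally come from a Screaming Frog or DeepCrawl export.

```python
# Sketch: estimate how many distinct crawled URLs collapse to the same
# path once query parameters are stripped.
from collections import Counter
from urllib.parse import urlsplit

CRAWLED_URLS = [  # hypothetical sample; use a crawl export in practice
    "https://www.example.com/shoes?color=red",
    "https://www.example.com/shoes?color=red&sort=price",
    "https://www.example.com/shoes?color=blue",
    "https://www.example.com/shoes",
]

counts = Counter(urlsplit(u).path for u in CRAWLED_URLS)
for path, n in counts.most_common():
    if n > 1:
        print(f"{n} crawled URLs collapse to {path}")
```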
Cause (Pass): On-page content is not readable by crawlers.

| Issue | Grade | Where | Start here | Reference |
| --- | --- | --- | --- | --- |
| Page copy isn't visible with basic JavaScript rendering capability. | Pass | Screaming Frog Crawl | Configuration > Spider > Rendering | https://developers.google.com/search/docs/guides/debug-rendering |
| We load crucial content in an iframe. | Pass | Screaming Frog Crawl | Custom filter for "<iframe" | https://support.google.com/webmasters/answer/34445?hl=en |
| We load crucial content in Flash. | Pass | Manual Page Testing | | https://support.google.com/webmasters/answer/72746?hl=en#1 |
| We serve different content to different user agents, including crawlers. | Pass | Screaming Frog Crawl | Configuration > User-Agent | https://support.google.com/webmasters/answer/66355?hl=en |
| Mobile URLs don't resolve when accessed with various user agents. | Pass | Screaming Frog Crawl | Configuration > User-Agent | https://developers.google.com/webmasters/mobile-sites/mobile-seo/ |
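The user-agent row above amounts to a cloaking check: fetch the same URL as a browser and as Googlebot and compare what comes back. A minimal sketch using response size as a coarse signal; the URL and user-agent strings are illustrative, and a large divergence warrants a manual look, not a verdict.

```python
# Sketch: compare response sizes for the same URL under different
# user agents as a rough cloaking signal.
import requests

URL = "https://www.example.com/"  # hypothetical
AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                 "+http://www.google.com/bot.html)",
}

sizes = {}
for name, ua in AGENTS.items():
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    sizes[name] = len(resp.text)

print(sizes)
if max(sizes.values()) > 1.5 * min(sizes.values()):
    print("Responses differ substantially by user agent; inspect manually.")
```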
Cause (Pass): We haven't shown where we do and don't want content discovered or indexed.

| Issue | Grade | Where | Start here | Reference |
| --- | --- | --- | --- | --- |
| We don't have nofollow tags on links pointing to non-indexable content. | Pass | Screaming Frog Crawl | Configuration > Spider > Basic (uncheck Follow Internal "nofollow") | https://www.youtube.com/watch?v=86GHCVRReJs |
| We have noindex tags on content we want in the index. | Pass | Screaming Frog Crawl | Meta Robots field | https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag?csw=1 |
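A script can surface the second issue directly: list every page that carries a meta robots noindex tag so you can confirm none of them should be in the index. A minimal sketch; the URLs are hypothetical, and the regex assumes the name attribute appears before content, so treat misses as possible false negatives.

```python
# Sketch: list pages carrying a meta robots noindex tag.
import re
import requests

URLS = [  # hypothetical sample
    "https://www.example.com/",
    "https://www.example.com/search",
]

# Rough pattern: assumes name="robots" comes before the content attribute.
NOINDEX = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

for url in URLS:
    if NOINDEX.search(requests.get(url, timeout=10).text):
        print(f"noindex found on {url}")
```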
Cause (Pass): Our site doesn't respect mobile-first best practices.

| Issue | Grade | Where | Start here | Reference |
| --- | --- | --- | --- | --- |
| We are missing links on our mobile site that are present on desktop. | Pass | Screaming Frog Crawl | | https://developers.google.com/search/mobile-sites/mobile-first-indexing |
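One way to check this on a single page is to fetch it with a desktop and a mobile user agent and diff the link sets; under mobile-first indexing, links missing from the mobile version are effectively invisible. A minimal sketch; the URL and user-agent strings are illustrative.

```python
# Sketch: diff the links served to a desktop UA vs. a mobile UA.
import re
import requests

URL = "https://www.example.com/"  # hypothetical
DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
MOBILE_UA = "Mozilla/5.0 (iPhone; CPU iPhone OS 12_0 like Mac OS X)"

def links(ua):
    html = requests.get(URL, headers={"User-Agent": ua}, timeout=10).text
    return set(re.findall(r'href=["\'](.*?)["\']', html))

missing_on_mobile = links(DESKTOP_UA) - links(MOBILE_UA)
for href in sorted(missing_on_mobile):
    print(f"Desktop-only link: {href}")
```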
Outcome (Fail): There is a technical reason indexed content doesn't rank for desired terms.
Cause (Pass): Our internal linking doesn't convey the relative importance of our content.

| Issue | Grade | Where | Start here | Reference |
| --- | --- | --- | --- | --- |
| We use redirects that aren't 301s. | Pass | DeepCrawl | Indexation > Non-200 Status > Non-301 Redirects | https://support.google.com/webmasters/answer/93633?hl=en&ref_topic=6001951 |
| We use JavaScript redirects. | Pass | Manual Page Testing | | https://support.google.com/webmasters/answer/2721217?hl=en |
| We use meta refresh tags. | Pass | Screaming Frog Crawl | Meta Refresh field | https://support.google.com/webmasters/answer/79812?hl=en |
| We have redirect chains. | Pass | Screaming Frog Crawl | Reports > Redirect Chains | https://support.google.com/webmasters/answer/6033086?hl=en |
| We link more to inconsequential content than to important organic content. | Pass | Screaming Frog Crawl | Inlinks field | https://googlewebmastercentral.blogspot.com/2008/10/importance-of-link-architecture.html |
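The redirect rows above can be verified per URL by tracing the chain hop by hop. A minimal sketch, assuming the requests library; the start URL is hypothetical.

```python
# Sketch: trace a redirect chain and flag non-301 hops and multi-hop chains.
import requests

START_URL = "http://example.com/old-page"  # hypothetical

resp = requests.get(START_URL, timeout=10)  # follows redirects by default
for hop in resp.history + [resp]:
    print(hop.status_code, hop.url)

if len(resp.history) > 1:
    print("Redirect chain detected; collapse it to a single 301.")
if any(h.status_code != 301 for h in resp.history):
    print("Non-301 redirect in the chain.")
```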
Cause (Pass): Our targeting elements don't help crawlers understand our content.

| Issue | Grade | Where | Start here | Reference |
| --- | --- | --- | --- | --- |
| Title tags are longer than 60 characters. | Pass | DeepCrawl | Content > Titles & Descriptions > Max Title Length | http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf |
| Title tags are duplicated across pages. | Pass | DeepCrawl | Content > Titles & Descriptions > Pages with Duplicate Titles | http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf |
| Title tags are missing. | Pass | DeepCrawl | Content > Titles & Descriptions > Missing Titles | http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf |
| H1 tags are duplicated across pages. | Pass | Screaming Frog Crawl | H1 field (sort by) | http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf |
| H1 tags are missing. | Pass | DeepCrawl | Content > Body Content > Missing H1 Tags | http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf |
| Pages have multiple H1 tags. | Pass | DeepCrawl | Content > Body Content > Multiple H1 Tag Pages | http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf |
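Several of the title and H1 checks above reduce to simple per-page rules: a title exists, is at most ~60 characters, and the page has exactly one H1. A minimal sketch of those rules; the URLs are hypothetical, and in practice the input would be a crawler export rather than a hand-written list.

```python
# Sketch: flag missing/overlong titles and missing/multiple H1 tags.
import re
import requests

URLS = [  # hypothetical sample
    "https://www.example.com/",
    "https://www.example.com/about",
]

for url in URLS:
    html = requests.get(url, timeout=10).text
    titles = re.findall(r"<title[^>]*>(.*?)</title>", html,
                        re.IGNORECASE | re.DOTALL)
    h1s = re.findall(r"<h1[^>]*>", html, re.IGNORECASE)
    title = titles[0].strip() if titles else ""
    if not title:
        print(f"{url}: missing title")
    elif len(title) > 60:
        print(f"{url}: title is {len(title)} characters")
    if len(h1s) != 1:
        print(f"{url}: {len(h1s)} H1 tags")
```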
Cause (Fail): We haven't implemented bleeding-edge SEO best practices.

| Issue | Grade | Where | Start here | Reference |
| --- | --- | --- | --- | --- |
| We don't use HTTPS. | Fail | Screaming Frog Crawl | URL field | https://googlewebmastercentral.blogspot.com/2014/08/https-as-ranking-signal.html |
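Beyond spotting http:// URLs in a crawl, it's worth confirming that the HTTP homepage 301s straight to HTTPS. A minimal sketch; the domain is hypothetical.

```python
# Sketch: confirm the HTTP homepage 301s to an HTTPS URL.
import requests

resp = requests.get("http://www.example.com/", timeout=10,
                    allow_redirects=False)
location = resp.headers.get("Location", "")
if resp.status_code == 301 and location.startswith("https://"):
    print("OK: HTTP 301s to", location)
else:
    print(f"Check needed: status {resp.status_code}, Location {location!r}")
```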
Cause (Pass): We are duplicating content that first appeared elsewhere.

| Issue | Grade | Where | Start here | Reference |
| --- | --- | --- | --- | --- |
| Content is scraped from other sources. | OK | Manual Page Testing | LMGTFY (search Google for a distinctive phrase from the page) | https://support.google.com/webmasters/answer/2721312?hl=en |
Cause (Pass): The site is slow enough that Google would prefer not to show it to searchers.

| Issue | Grade | Where | Start here | Reference |
| --- | --- | --- | --- | --- |
| Our site takes longer than 5 seconds to load. | Pass | Chrome Inspector | Inspector > Network | https://developers.google.com/speed/pagespeed/ |
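Chrome's Network panel is the right place to look, but a crude scripted baseline can catch gross regressions. The sketch below times only the HTML response, so it ignores rendering and subresources; treat it as a lower bound on real page load. The URL is hypothetical.

```python
# Sketch: crude server response timing for the HTML document only.
import requests

resp = requests.get("https://www.example.com/", timeout=30)  # hypothetical
seconds = resp.elapsed.total_seconds()
print(f"HTML responded in {seconds:.2f}s")
if seconds > 5:
    print("Over the 5-second threshold in the checklist.")
```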
Outcome (Pass): There is a technical reason site content isn't well-presented in search.
Cause (Pass): We haven't indicated our preferred content.

| Issue | Grade | Where | Start here | Reference |
| --- | --- | --- | --- | --- |
| We don't use canonical tags to indicate pages we want ranking. | Pass | Screaming Frog Crawl | Canonical field | https://support.google.com/webmasters/answer/139066?hl=en |
| We haven't specified the canonical domain pattern in Search Console. | Pass | Google Search Console | Config > Site Settings | https://support.google.com/webmasters/answer/44231?hl=en |
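A quick scripted version of the canonical check: print each page's canonical URL so you can verify that pages you want ranking point at themselves. A minimal sketch; the URLs are hypothetical, and the regex assumes rel appears before href within the link tag.

```python
# Sketch: print each page's declared canonical URL, if any.
import re
import requests

URLS = [  # hypothetical sample
    "https://www.example.com/",
    "https://www.example.com/?ref=nav",
]

# Rough pattern: assumes rel="canonical" comes before the href attribute.
CANONICAL = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\'](.*?)["\']',
    re.IGNORECASE,
)

for url in URLS:
    match = CANONICAL.search(requests.get(url, timeout=10).text)
    print(url, "->", match.group(1) if match else "no canonical tag")
```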
Cause (Pass): We aren't showing the relative importance of our content through internal linking.

| Issue | Grade | Where | Start here | Reference |
| --- | --- | --- | --- | --- |
| We link more to inconsequential content than to important organic content. | Pass | Screaming Frog Crawl | Inlinks field | https://googlewebmastercentral.blogspot.com/2008/10/importance-of-link-architecture.html |
Cause (Pass): There are problems with our schema markup.

| Issue | Grade | Where | Start here | Reference |
| --- | --- | --- | --- | --- |
| Our schema markup is missing or incomplete. | Pass | Structured Data Testing Tool | | https://developers.google.com/structured-data/policies |
| Our schema markup is spammy. | Pass | Structured Data Testing Tool | | https://support.google.com/webmasters/answer/3498001?hl=en |
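The Structured Data Testing Tool remains the authoritative check, but a script can quickly inventory which JSON-LD types a page declares at all. A minimal sketch; the URL is hypothetical, and it does not validate the markup against Google's policies.

```python
# Sketch: extract JSON-LD blocks from a page and list their @type values.
import json
import re
import requests

URL = "https://www.example.com/product"  # hypothetical
html = requests.get(URL, timeout=10).text

blocks = re.findall(
    r'<script[^>]+type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
    html,
    re.IGNORECASE | re.DOTALL,
)
for block in blocks:
    try:
        data = json.loads(block)
    except ValueError:
        print("Malformed JSON-LD block")
        continue
    # Simplification: ignores @graph wrappers and nested entities.
    items = data if isinstance(data, list) else [data]
    for item in items:
        if isinstance(item, dict):
            print("@type:", item.get("@type"))
```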
Cause (Pass): We haven't signaled our international content.

| Issue | Grade | Where | Start here | Reference |
| --- | --- | --- | --- | --- |
| We haven't implemented hreflang across localized sites. | Pass | DeepCrawl | Config > Hreflang > Pages without Hreflang Tags | https://support.google.com/webmasters/answer/189077?hl=en |
| We haven't linked to all localized content. | Pass | Screaming Frog Crawl | URL field | https://support.google.com/webmasters/answer/182192?hl=en&ref_topic=2370587 |
| We haven't set GSC region targeting to reflect regions we're targeting. | Pass | Google Search Console | Search Traffic > International Targeting > Country | https://support.google.com/webmasters/answer/62399?hl=en |
| We're using URL parameters to distinguish localized content. | Pass | Screaming Frog Crawl | URL field | https://support.google.com/webmasters/answer/182192?hl=en&ref_topic=2370587 |
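Hreflang only counts when it is reciprocal: each alternate URL must link back. The sketch below checks that from one starting page. It is a rough sketch: the start URL is hypothetical, and the regex assumes rel, hreflang, and href appear in that order within the link tag, so treat misses as prompts to look manually.

```python
# Sketch: extract hreflang annotations and confirm each alternate links back.
import re
import requests

# Rough pattern: assumes attribute order rel, hreflang, href.
HREFLANG = re.compile(
    r'<link[^>]+rel=["\']alternate["\'][^>]*hreflang=["\'](.*?)["\']'
    r'[^>]*href=["\'](.*?)["\']',
    re.IGNORECASE,
)

def hreflangs(url):
    return HREFLANG.findall(requests.get(url, timeout=10).text)

start = "https://www.example.com/en/"  # hypothetical
for lang, alt_url in hreflangs(start):
    back_links = {href for _, href in hreflangs(alt_url)}
    if start not in back_links:
        print(f"{alt_url} ({lang}) does not link back to {start}")
```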
Cause (Pass): We haven't signaled our mobile content.

| Issue | Grade | Where | Start here | Reference |
| --- | --- | --- | --- | --- |
| We're using dynamic serving but haven't implemented the Vary HTTP header. | Pass | DeepCrawl | Mobile > Categorization > Dynamically Served | https://developers.google.com/webmasters/mobile-sites/mobile-seo/dynamic-serving |
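If the same URL serves different HTML to mobile user agents (dynamic serving), the response should include a Vary: User-Agent header so crawlers and caches know the content varies. A minimal check, with a hypothetical URL:

```python
# Sketch: check whether a dynamically served URL sends Vary: User-Agent.
import requests

resp = requests.get("https://www.example.com/", timeout=10)  # hypothetical
vary = resp.headers.get("Vary", "")
if "user-agent" in vary.lower():
    print("OK: Vary includes User-Agent")
else:
    print(f"Vary header is {vary!r}; add User-Agent if dynamically serving.")
```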