Row | Check | Grade | Type | Where | Start here | Reference
---|---|---|---|---|---|---
2 | There is a technical reason good content isn’t indexed. | Fail | Outcome | |||||
3 | URLs are not discoverable by crawlers. | Fail | Cause | |||||
4 | XML sitemaps aren't uploaded to GSC. | Pass | Issue | Google Search Console | Index > Sitemaps | https://support.google.com/webmasters/answer/183668 | ||
5 | XML sitemaps don't reflect the valid URLs on the site. | OK | Issue | Google Search Console | Index > Sitemaps | https://support.google.com/webmasters/answer/7451001 | ||
6 | Internal navigation breaks without basic JavaScript rendering capability. | Fail | Issue | Screaming Frog Crawl | Configuration > Spider > Rendering | |||
7 | There are more than ~300 links on important pages. | Pass | Issue | Screaming Frog Crawl | Outlinks field | https://www.youtube.com/watch?v=QHG6BkmzDEM | ||
8 | Important content is >4 clicks from the homepage. | Pass | Issue | Screaming Frog Crawl | Crawl depth field | https://googlewebmastercentral.blogspot.com/2008/10/importance-of-link-architecture.html | ||
9 | Robots.txt blocks content we want in the index. | Pass | Issue | Google Search Console | Index > Coverage > Submitted URL blocked by robots.txt | https://support.google.com/webmasters/answer/6062608 | ||
10 | The website is timing out. | Pass | Issue | Screaming Frog Crawl | Status code | |||
11 | The site is down. | Pass | Issue | Screaming Frog Crawl | Status code | |||
12 | Bad URLs are being presented to crawlers as good. | Fail | Cause | |||||
13 | Error pages return 200 status codes. | Pass | Issue | Google Search Console | Index > Coverage > Submitted URL seems to be a Soft 404 | https://support.google.com/webmasters/answer/93641 | ||
14 | Internal links point to URLs returning 4XX or 5XX status codes. | Pass | Issue | DeepCrawl | Summary > All Pages > HTTP Status Breakdown | |||
15 | Robots.txt doesn't block URLs that don't belong in the index. | Pass | Issue | Screaming Frog Crawl | Configure > robots.txt > Settings > Respect robots.txt | https://support.google.com/webmasters/answer/6062608 | ||
16 | Sitemaps contain valid URLs we want to keep out of the index. | OK | Issue | Screaming Frog Sitemap Crawl | List mode > Upload > Download sitemap | |||
17 | Sitemaps contain invalid URLs. | Fail | Issue | Google Search Console | Index > Sitemaps > See index coverage | https://support.google.com/webmasters/answer/183669 | ||
18 | Duplication is causing Google to ignore pages. | Fail | Cause | |||||
19 | Canonical tags don't associate duplicate content. | Pass | Issue | DeepCrawl | Content > Body Content > Duplicate Body Sets | https://support.google.com/webmasters/answer/139066 | ||
20 | URLs resolve under both HTTP and HTTPS. | Pass | Issue | Screaming Frog Crawl | URL field | https://support.google.com/webmasters/answer/6073543 | ||
21 | Duplicate content shows up on other domains or subdomains. | OK | Issue | DeepCrawl | Content > Body Content > Duplicate Body Sets | https://support.google.com/webmasters/answer/66359 | ||
22 | Multiple URL patterns return the same content. | Fail | Issue | DeepCrawl | Content > Body Content > Duplicate Body Sets | https://support.google.com/webmasters/answer/66359 | ||
23 | Mobile markup isn't implemented. | Pass | Issue | DeepCrawl | Mobile > Categorization > Separate Mobile | https://developers.google.com/webmasters/mobile-sites/mobile-seo/separate-urls | ||
24 | Our site serves too many unique pages. | Fail | Cause | |||||
25 | Faceted navigation generates an unbounded number of URLs. | OK | Issue | DeepCrawl | Summary > Dashboard > Web Crawl Depth | https://googlewebmastercentral.blogspot.com/2014/02/faceted-navigation-best-and-5-of-worst.html | ||
26 | We haven't specified parameter behavior in GSC. | Fail | Issue | Google Search Console | Legacy tools > URL parameters | https://support.google.com/webmasters/answer/6080550 | ||
27 | On-page content is not readable by crawlers. | Pass | Cause | |||||
28 | Page copy isn't visible with basic JavaScript rendering capability. | Pass | Issue | Screaming Frog Crawl | Configuration > Spider > Rendering | https://developers.google.com/search/docs/guides/fix-search-javascript | ||
29 | We load crucial content in an iframe. | Pass | Issue | Screaming Frog Crawl | Custom filter for "<iframe" | https://support.google.com/webmasters/answer/34445 | ||
30 | We load crucial content in Flash. | Pass | Issue | Screaming Frog Crawl | Configuration > Spider > Check SWF | https://support.google.com/webmasters/answer/72746#1 | ||
31 | We serve different content to different user agents, including crawlers. | Pass | Issue | Screaming Frog Crawl | Configuration > User-Agent | https://support.google.com/webmasters/answer/66355 | ||
32 | Mobile URLs don't resolve when accessed with various user agents. | Pass | Issue | Screaming Frog Crawl | Configuration > User-Agent | https://developers.google.com/webmasters/mobile-sites/mobile-seo/ | ||
33 | We haven't shown where we do and don't want content discovered or indexed. | Pass | Cause | |||||
34 | We don't have nofollow tags on links pointing to non-indexable content. | Pass | Issue | Screaming Frog Crawl | Configuration > Spider > Basic (uncheck Follow Internal "nofollow") | https://www.youtube.com/watch?v=86GHCVRReJs | ||
35 | We have noindex tags on content we want in the index. | Pass | Issue | Screaming Frog Crawl | Directives tab > Noindex filter | https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag?csw=1 | ||
36 | Our site doesn't respect mobile-first best practices. | Pass | Cause | |||||
37 | We are missing links on our mobile site that are present on desktop. | Pass | Issue | Screaming Frog Crawl | Configuration > User-Agent | https://developers.google.com/search/mobile-sites/mobile-first-indexing | ||
38 | There is a technical reason indexed content doesn't rank for desired terms. | Fail | Outcome | |||||
39 | Our internal linking doesn't convey the relative importance of our content. | Pass | Cause | |||||
40 | We use redirects that aren't 301s. | Pass | Issue | DeepCrawl | Indexation > Non-200 Status > Non-301 Redirects | https://support.google.com/webmasters/answer/93633 | ||
41 | We use JavaScript redirects. | Pass | Issue | Manual Page Testing | | https://support.google.com/webmasters/answer/2721217 | ||
42 | We use meta refresh tags. | Pass | Issue | Screaming Frog Crawl | Meta Refresh field | https://support.google.com/webmasters/answer/79812 | ||
43 | We have redirect chains. | Pass | Issue | Screaming Frog Crawl | Reports > Redirect Chains | https://support.google.com/webmasters/answer/6033086 | ||
44 | We link more to inconsequential content than important organic content. | Pass | Issue | Screaming Frog Crawl | Inlinks field | https://googlewebmastercentral.blogspot.com/2008/10/importance-of-link-architecture.html | ||
45 | Our targeting elements don't help crawlers understand our content. | Pass | Cause | |||||
46 | Title tags are longer than 60 characters. | Pass | Issue | DeepCrawl | Content > Titles & Descriptions > Max Title Length | http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf | ||
47 | Title tags are duplicated across pages. | Pass | Issue | DeepCrawl | Content > Titles & Descriptions > Pages with Duplicate Titles | http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf | ||
48 | Title tags are missing. | Pass | Issue | DeepCrawl | Content > Titles & Descriptions > Missing Titles | http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf | ||
49 | H1 tags are duplicated across pages. | Pass | Issue | Screaming Frog Crawl | H1 field (sort by) | http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf | ||
50 | H1 tags are missing. | Pass | Issue | DeepCrawl | Content > Body Content > Missing H1 Tags | http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf | ||
51 | Pages have multiple H1 tags. | Pass | Issue | DeepCrawl | Content > Body Content > Multiple H1 Tag Pages | http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf | ||
52 | We haven't implemented user-first practices desired by Google. | Fail | Cause | |||||
53 | We don't use HTTPS. | Fail | Issue | Screaming Frog Crawl | URL field | https://googlewebmastercentral.blogspot.com/2014/08/https-as-ranking-signal.html | ||
54 | We use an interstitial to present some content on load. | Fail | Issue | Manual Page Testing | | https://webmasters.googleblog.com/2016/08/helping-users-easily-access-content-on.html | ||
55 | We are duplicating content that first appeared elsewhere. | Pass | Cause | |||||
56 | Content is scraped from other sources. | OK | Issue | Manual Page Testing | Search Google for exact copied phrases | https://support.google.com/webmasters/answer/2721312 | ||
57 | The site is slow enough that Google would prefer not to show it to searchers. | Pass | Cause | |||||
58 | Our site takes longer than 5 seconds to load. | Pass | Issue | Chrome Inspector | Inspector > Network | https://developers.google.com/speed/pagespeed/ | ||
59 | There is a technical reason site content isn't well-presented in search. | Pass | Outcome | |||||
60 | We haven't indicated our preferred content. | Pass | Cause | |||||
61 | We don't use canonical tags to indicate pages we want ranking. | Pass | Issue | Screaming Frog Crawl | Canonical field | https://support.google.com/webmasters/answer/139066?hl=en | ||
62 | We aren't showing the relative importance of our content through internal linking. | Pass | Cause | |||||
63 | We link more to inconsequential content than important organic content. | Pass | Issue | Screaming Frog Crawl | Inlinks field | https://googlewebmastercentral.blogspot.com/2008/10/importance-of-link-architecture.html | ||
64 | There are problems with our schema markup. | Pass | Cause | |||||
65 | Our schema markup is missing or incomplete. | Pass | Issue | Structured Data Testing Tool | | https://developers.google.com/structured-data/policies | ||
66 | Our schema markup is spammy. | Pass | Issue | Structured Data Testing Tool | | https://support.google.com/webmasters/answer/3498001?hl=en | ||
67 | We haven't signaled our international content. | Pass | Cause | |||||
68 | We haven't implemented hreflang across localized sites. | Pass | Issue | DeepCrawl | Config > Hreflang > Pages without Hreflang Tags | https://support.google.com/webmasters/answer/189077?hl=en | ||
69 | We haven't linked to all localized content. | Pass | Issue | Screaming Frog Crawl | URL field | https://support.google.com/webmasters/answer/182192?hl=en&ref_topic=2370587 | ||
70 | We haven't set GSC region targeting to reflect regions we're targeting. | Pass | Issue | Google Search Console | Legacy tools > International targeting > Country | https://support.google.com/webmasters/answer/62399?hl=en | ||
71 | We're using URL parameters to distinguish localized content. | Pass | Issue | Screaming Frog Crawl | URL field | https://support.google.com/webmasters/answer/182192?hl=en&ref_topic=2370587 | ||
72 | We haven't signaled our mobile content. | Pass | Cause | |||||
73 | We're using dynamic serving but haven't implemented the Vary-HTTP header. | Pass | Issue | DeepCrawl | Mobile > Categorization > Dynamically Served | https://developers.google.com/webmasters/mobile-sites/mobile-seo/dynamic-serving |
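Several of the on-page checks above (rows 35, 46–51, and 61) can be spot-checked on a single page without a full crawler run. Below is a minimal Python sketch using only the standard library; `AuditParser` and `audit_html` are hypothetical helper names, and each threshold mirrors the checklist row noted in the comments:

```python
from html.parser import HTMLParser

class AuditParser(HTMLParser):
    """Collects the on-page elements several checklist rows look at."""
    def __init__(self):
        super().__init__()
        self.title_parts = []
        self.h1_count = 0
        self.robots_meta = []
        self.canonicals = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = {k: (v or "") for k, v in attrs}
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots_meta.append(attrs.get("content", "").lower())
        elif tag == "link" and "canonical" in attrs.get("rel", "").lower():
            self.canonicals.append(attrs.get("href", ""))

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title_parts.append(data)

def audit_html(html):
    """Return findings for one page, keyed to checklist row numbers."""
    p = AuditParser()
    p.feed(html)
    title = "".join(p.title_parts).strip()
    findings = []
    if not title:
        findings.append("row 48: title missing")
    elif len(title) > 60:          # row 46's 60-character guideline
        findings.append("row 46: title longer than 60 characters")
    if p.h1_count == 0:
        findings.append("row 50: H1 missing")
    elif p.h1_count > 1:
        findings.append("row 51: multiple H1 tags")
    if any("noindex" in m for m in p.robots_meta):
        findings.append("row 35: noindex present")
    if not p.canonicals:
        findings.append("row 61: no canonical tag")
    return findings
```

This only covers what is visible in raw HTML; JavaScript-rendered tags (rows 6 and 28) still need a rendering crawler such as Screaming Frog in JavaScript mode.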