Hop Online Technical SEO Audit Template 2019

Hop Online is a Moz recommended company.

Hop Online Technical SEO Audit: column legend
  Requirement: SEO requirement to be met for optimal search performance
  Grade: Compliance with the requirement; sortable (N/A = requirement is currently not relevant)
  Priority: Priority of the action to be taken; sortable
  Recommendation: Action to be taken to meet the requirement
  Client Resolution: Mark whether the suggested action is acceptable (N/A = Not Applicable)
  Resolution by: Who made the resolution
  Implemented by: Who implemented the recommendation
  Implementation Date: When the recommendation was implemented
  Validated by: Who validated the implementation
  Validation Date: When the implementation was validated
  Audit Method Used: The audit method we used to check whether the requirement is met
  Quick Reference: Google guidelines (or another reputable source) on the requirement
Crawling and indexation [Grade: Incomplete]

Requirement: The website has an XML sitemap.
  Recommendation: Create an XML sitemap for your website.

Requirement: The XML sitemap is uploaded to Google Search Console.
  Recommendation: Upload your XML sitemap to Google Search Console.
  Audit method: Google Search Console (GSC) > Crawl > Sitemap
  Reference: https://support.google.com/webmasters/answer/183668?hl=en

Requirement: The XML sitemap contains only pages with response code 200.
  Recommendation: XML sitemaps should contain only pages with response code 200. Some of the pages in your sitemap are returning status codes other than 200 OK. See the .... tab.
  Audit method: Screaming Frog > Sitemap Crawl
  Reference: https://webmasters.googleblog.com/2017/01/what-crawl-budget-means-for-googlebot.html
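A check like this can be scripted. Below is a minimal Python sketch that extracts the URLs from a sitemap so each one can then be requested and flagged if it returns anything other than 200 OK; the sample sitemap and example.com URLs are placeholders, not taken from the audit.

```python
# Sketch: extract every <loc> URL from an XML sitemap so each can be
# checked for a 200 OK response. The sample sitemap is illustrative.
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> list:
    """Return every <loc> URL listed in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/old-page</loc></url>
</urlset>"""

for url in sitemap_urls(sample):
    print(url)
    # In a real audit, request each URL (e.g. with urllib.request)
    # and flag any response code other than 200.
```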
Requirement: The XML sitemap does not contain empty or low-quality pages.
  Recommendation: XML sitemaps should contain only pages with enough content and search value. Make sure there are no empty or thin pages in your sitemap. We found that your sitemap contains the following empty and low-quality pages:
  Audit method: Screaming Frog > Sitemap Crawl + Manual check
  Reference: https://webmasters.googleblog.com/2017/01/what-crawl-budget-means-for-googlebot.html

Requirement: The XML sitemap does not contain too many pages.
  Recommendation: Group the major sections of your website together in individual sitemaps and link them from a sitemap index file. Each sitemap should contain fewer than 10,000 URLs.
  Audit method: Screaming Frog > Sitemap Crawl + Manual check
  Reference: https://webmasters.googleblog.com/2017/01/what-crawl-budget-means-for-googlebot.html

Requirement: An HTML sitemap with important pages is linked internally.
  Recommendation: Create an HTML sitemap with the most important pages of your website and add a link to it in the website footer. Include the following pages:
  Audit method: Manual check
  Reference: https://support.google.com/webmasters/answer/183668?hl=en

Requirement: There is an acceptable number of 404 error pages reported in Search Console.
  Recommendation: Fix 404 error pages by 301-redirecting them to their most relevant active location. Do not mass-redirect 404 error pages to the homepage. See the Search Console 404s and Internal 404s tabs.
  Audit method: Search Console > Crawl > Crawl Errors
  Reference: https://webmasters.googleblog.com/2017/01/what-crawl-budget-means-for-googlebot.html

Requirement: There is a custom 404 error page with a search form and all menu links.
  Recommendation: Create a compelling custom 404 error page with a search form and all menu links.
  Audit method: Search Console > Crawl > Crawl Errors
  Reference: https://support.google.com/webmasters/answer/93641?hl=en

Requirement: Internal links don't point to URLs returning 3XX, 4XX or 5XX status codes.
  Recommendation: Check the internal links in your website structure and make sure that all of them return a 200 OK status code. A list of your internal pages returning 3XX, 4XX or 5XX status codes can be found in the ..... tab of this spreadsheet.
  Audit method: Screaming Frog > Status Code
  Reference: https://webmasters.googleblog.com/2017/01/what-crawl-budget-means-for-googlebot.html
Requirement: Internal linking is implemented with regular anchor tags within the HTML or the DOM (versus leveraging JavaScript functions).
  Recommendation: Internal linking should be implemented with regular anchor tags within the HTML or the DOM rather than with JavaScript functions. Currently, JavaScript is used for links in the following cases:
  Audit method: Screaming Frog Crawl
  Reference: https://searchengineland.com/google-will-stop-crawling-old-ajax-crawling-scheme-q2-2018-287653

Requirement: Absolute (full) URL paths are used in the code.
  Recommendation: Use only absolute (full) URL paths in your code. Currently, relative URL paths are used in the following cases:

Requirement: CSS and JavaScript are not disallowed in robots.txt.
  Recommendation: Allow crawling of CSS and JavaScript so that Google can see your site like an average user. Disallowing these resources can lead to suboptimal rankings. Remove the following directives from your robots.txt file:
  Audit method: Robots.txt
  Reference: https://developers.google.com/search/mobile-sites/mobile-seo/common-mistakes

Requirement: Important pages are not disallowed in robots.txt.
  Recommendation: Important pages should not be disallowed in robots.txt. Remove the following directives from your robots.txt file:
  Audit method: Robots.txt
  Reference: https://support.google.com/webmasters/answer/6062608?hl=en
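Whether a given URL is blocked can be verified programmatically with Python's standard-library robots.txt parser; the robots.txt content and URLs below are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; replace with the site's real file.
robots_txt = """User-agent: *
Disallow: /cart/
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# An important page should come back as crawlable; a blocked one should not.
for url in ["https://example.com/products/widget", "https://example.com/cart/view"]:
    print(url, "crawlable:", rp.can_fetch("Googlebot", url))
```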
Requirement: Important pages do not have a "noindex, follow" meta tag.
  Recommendation: Important pages should not contain a "noindex, follow" meta tag. Replace the "noindex, follow" tag with "index, follow" on these pages:
  Audit method: Screaming Frog > Meta Robots
  Reference: https://developers.google.com/search/reference/robots_meta_tag?csw=1

Requirement: Important pages have only self-referencing canonical tags, if any.
  Recommendation: Important pages may contain only self-referencing canonical tags. Change the canonical tags on the following pages:
  Audit method: Screaming Frog > Canonical Link Element
  Reference: https://support.google.com/webmasters/answer/139066?hl=en

Requirement: Important content is not loaded in Flash.
  Recommendation: Important content should not be loaded in Flash. Make sure that the following content is published as text:
  Audit method: Manual check
  Reference: https://support.google.com/webmasters/answer/35769?hl=en

Requirement: Important content is not embedded inside images.
  Recommendation: Important content should not be embedded inside images. Make sure that the following content is published as text:
  Audit method: Manual check
  Reference: https://support.google.com/webmasters/answer/35769?hl=en
Site architecture [Grade: Incomplete]

Requirement: Website navigation contains canonical links only.
  Recommendation: Website navigation should contain canonical links only. The following links in your website navigation are not canonical:
  Audit method: Manual check
  Reference: https://support.google.com/webmasters/answer/139066?hl=en

Requirement: Breadcrumb navigation is available.
  Recommendation: Add breadcrumb navigation to your website.
  Audit method: Manual check on a sample of pages
  Reference: https://moz.com/learn/seo/internal-link

Requirement: The company logo links to the canonical homepage URL.
  Recommendation: Your company logo should link to the canonical homepage URL, which is ........
  Audit method: Manual check on a sample of pages
  Reference: https://moz.com/learn/seo/internal-link

Requirement: Important internal pages are up to 3 levels deep.
  Recommendation: Make sure that important pages are no more than 3 levels deep in your website structure. The following pages are more than 3 levels deep: ... See your website crawl depth in the Crawl depth tab of this spreadsheet.
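Crawl depth can also be computed directly from a link graph with a breadth-first traversal from the homepage; the toy graph below is illustrative, not the client's site.

```python
from collections import deque

# Toy site graph (page -> pages it links to); illustrative only.
links = {
    "/": ["/category", "/about"],
    "/category": ["/category/widgets"],
    "/category/widgets": ["/category/widgets/blue"],
    "/about": [],
    "/category/widgets/blue": [],
}

# Breadth-first traversal from the homepage gives each page's click depth.
depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for nxt in links.get(page, []):
        if nxt not in depth:
            depth[nxt] = depth[page] + 1
            queue.append(nxt)

too_deep = sorted(p for p, d in depth.items() if d > 3)
print(depth)
print("pages deeper than 3 levels:", too_deep)
```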
Requirement: There are vertical links across the website.
  Recommendation: Establish the following vertical links across the website:
  Audit method: Manual check on a sample of pages
  Reference: https://moz.com/learn/seo/internal-link

Requirement: There are horizontal links across the website.
  Recommendation: Establish the following horizontal links across the website:
  Audit method: Manual check on a sample of pages
  Reference: https://moz.com/learn/seo/internal-link

Requirement: There are contextual links across the website.
  Recommendation: Establish the following contextual links across the website:
  Audit method: Manual check on a sample of pages
  Reference: https://moz.com/learn/seo/internal-link

Requirement: Important internal pages have more than 20 unique inlinks.
  Recommendation: Improve the internal linking structure of your website. The following important internal pages (i.e. product pages, category pages, etc.) should have at least 20 unique inlinks pointing to them: Prioritize your internal pages based on their search value and traffic potential. You can see the inlinks data in the Inlinks and outlinks tab.
  Audit method: Screaming Frog > Unique Inlinks and a Manual check
  Reference: https://moz.com/learn/seo/internal-link

Requirement: Important internal pages have fewer than 50 unique outlinks.
  Recommendation: Improve the internal linking structure of your website. The following important internal pages (i.e. product pages, category pages, etc.) should have fewer than 50 unique outlinks: Prioritize your internal pages based on their search value and traffic potential. You can see the outlinks data in the Inlinks and outlinks tab.
  Audit method: Screaming Frog > Unique Inlinks and a Manual check
  Reference: https://moz.com/learn/seo/internal-link
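Unique inlink counts can be derived from exported crawl data (for example, a list of source/target link pairs) by grouping link sources per target page; the edge list below is a toy example, not audit data.

```python
from collections import defaultdict

# Toy link graph: (source page, target page) pairs from a crawl; illustrative.
edges = [
    ("/", "/products"), ("/", "/blog"),
    ("/blog", "/products"), ("/blog/post-1", "/products"),
    ("/products", "/products/widget"),
]

# A set per target page de-duplicates repeated links from the same source.
inlinks = defaultdict(set)
for src, dst in edges:
    inlinks[dst].add(src)

for page, sources in sorted(inlinks.items()):
    print(page, "unique inlinks:", len(sources))
```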
Duplicate content handling [Grade: Incomplete]

Requirement: The www or non-www version of the website is specified by 301 redirect.
  Recommendation: 301-redirect the non-www HTTPS version of the website to the https://www one. Make sure that each non-www HTTPS URL is redirected to its equivalent www HTTPS one.
  Audit method: Ahrefs
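The redirect target for each URL variant can be sketched as a small normalization function, assuming https://www is the chosen canonical host (adjust if the site standardizes on non-www); the example URLs are made up.

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str) -> str:
    """Normalize any scheme/host variant to the https://www form (illustrative)."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if not host.startswith("www."):
        host = "www." + host
    return urlunsplit(("https", host, parts.path, parts.query, parts.fragment))

print(canonical_url("http://example.com/page?x=1"))
```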
Requirement: Pages that should not be crawled are disallowed in robots.txt.
  Recommendation: Add the following disallow directives to your robots.txt file:
  Audit method: Robots.txt
  Reference: https://webmasters.googleblog.com/2017/01/what-crawl-budget-means-for-googlebot.html

Requirement: Parameter behavior is handled correctly in Search Console or in the robots.txt file.
  Recommendation: Set the following parameters to "No URLs" in Search Console: ... Use the following robots.txt directives to disallow parameter crawling:
  Audit method: Google Search Console > Crawl > URL Parameters
  Reference: https://support.google.com/webmasters/answer/6080550?hl=en

Requirement: Canonical tags associate duplicate content that needs to be served under different URLs.
  Recommendation: Add canonical tags on the following types of pages: ....... Canonical tags should be as follows: .......
  Audit method: Screaming Frog > Canonicals
  Reference: https://support.google.com/webmasters/answer/139066?hl=en

Requirement: Faceted navigation does not result in an unbounded amount of content.
  Recommendation: Add a "noindex, follow" tag to faceted navigation pages when more than 2 filters are used.
  Audit method: Manual check
  Reference: https://webmasters.googleblog.com/2017/01/what-crawl-budget-means-for-googlebot.html

Requirement: Paginated content uses rel="next" and rel="prev".
  Recommendation: Implement rel="next" and rel="prev" on your pagination pages, or create a view-all page and add a canonical tag pointing to it on the sequence pages.
  Audit method: Configuration > Spider > Basic Tab > Crawl Next/Prev
  Reference: https://support.google.com/webmasters/answer/1663744?hl=en

Requirement: There are no uppercase URLs.
  Recommendation: Use a 301 redirect to move uppercase URLs to their lowercase equivalents.
  Audit method: Screaming Frog > Internal crawl
  Reference: https://support.google.com/webmasters/answer/66359?hl=en
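One way to generate the redirect map is to lowercase only the path component, since hostnames are already case-insensitive; a sketch with made-up example URLs:

```python
from urllib.parse import urlsplit, urlunsplit

def lowercase_redirect(url: str):
    """Return the lowercase redirect target for a mixed-case URL path,
    or None if the path is already lowercase (no redirect needed)."""
    parts = urlsplit(url)
    path = parts.path
    if path == path.lower():
        return None
    return urlunsplit((parts.scheme, parts.netloc, path.lower(), parts.query, parts.fragment))

print(lowercase_redirect("https://example.com/Blog/My-Post"))  # https://example.com/blog/my-post
print(lowercase_redirect("https://example.com/blog/my-post"))  # None
```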
Requirement: The website is secured with HTTPS.
  Recommendation: Make the website secure with HTTPS.

Requirement: There are no HTTP and HTTPS URLs that show the same content.
  Recommendation: 301-redirect all HTTP URLs to their HTTPS equivalents.
  Audit method: Screaming Frog > Internal crawl
  Reference: https://support.google.com/webmasters/answer/66359?hl=en

Requirement: The HTTPS version is added to Google Search Console.
  Recommendation: Add the HTTPS property to Search Console.
  Audit method: Google Search Console
  Reference: https://support.google.com/webmasters/answer/6073543?hl=en
Site speed [Grade: Incomplete]

Requirement: The website takes < 3 seconds to load.
  Recommendation: Your website should take less than 3 seconds to load.
  Audit method: GSC > Crawl > Crawl Stats > Time spent downloading a page (in milliseconds)
  Reference: https://developers.google.com/speed/

Requirement: Server response time is under 200 ms.
  Recommendation: Your server responded in .... seconds. There are many factors that can slow down your server response time.
  Audit method: PageSpeed Tools > Insights
  Reference: https://developers.google.com/speed/docs/insights/Server

Requirement: Compression is enabled.
  Recommendation: Enable and test gzip compression support on your web server.
  Audit method: PageSpeed Tools > Insights
  Reference: https://developers.google.com/speed/docs/insights/EnableCompression
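The benefit of compression is easy to demonstrate with Python's gzip module; typical HTML compresses to a fraction of its raw size. The sample markup below is synthetic.

```python
import gzip

# Synthetic, repetitive HTML; real pages compress similarly well.
html = ("<html><body>" + "<p>Lorem ipsum dolor sit amet.</p>" * 100 + "</body></html>").encode()
compressed = gzip.compress(html)
print(f"{len(html)} bytes raw -> {len(compressed)} bytes gzipped "
      f"({100 * len(compressed) / len(html):.0f}% of original)")
```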
Requirement: Visible content is prioritized.
  Recommendation: Your page requires additional network round trips to render the above-the-fold content. For best performance, reduce the amount of HTML needed to render above-the-fold content.

Requirement: CSS is minified.
  Recommendation: To minify CSS, try CSSNano and csso.
  Audit method: PageSpeed Tools > Insights
  Reference: https://developers.google.com/speed/docs/insights/MinifyResources

Requirement: JavaScript is minified.
  Recommendation: To minify JavaScript, try UglifyJS.
  Audit method: PageSpeed Tools > Insights
  Reference: https://developers.google.com/speed/docs/insights/MinifyResources

Requirement: HTML is minified.
  Recommendation: To minify HTML, try HTMLMinifier.
  Audit method: PageSpeed Tools > Insights
  Reference: https://developers.google.com/speed/docs/insights/MinifyResources

Requirement: Browser caching is leveraged.
  Recommendation: Set an expiry date or a maximum age in the HTTP headers for static resources; this instructs the browser to load previously downloaded resources from the local disk rather than over the network.
  Audit method: PageSpeed Tools > Insights
  Reference: https://developers.google.com/speed/docs/insights/LeverageBrowserCaching
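The response headers involved look like the following sketch; the values are illustrative (a one-year max-age is a common choice for fingerprinted static assets whose URL changes when the content changes).

```python
from datetime import datetime, timedelta, timezone

# Illustrative cache headers for a static asset; not taken from the audit.
max_age = 31536000  # seconds in one year
expires = (datetime.now(timezone.utc) + timedelta(seconds=max_age)).strftime(
    "%a, %d %b %Y %H:%M:%S GMT")

headers = {
    "Cache-Control": f"public, max-age={max_age}",
    "Expires": expires,
}
for name, value in headers.items():
    print(f"{name}: {value}")
```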
Requirement: No render-blocking JavaScript and CSS in above-the-fold content.
  Recommendation: Eliminate render-blocking JavaScript and CSS in above-the-fold content. Your page has 4 blocking script resources and 4 blocking CSS resources. This causes a delay in rendering your page.
  Audit method: PageSpeed Tools > Insights
  Reference: https://developers.google.com/speed/docs/insights/OptimizeCSSDelivery

Requirement: No landing page redirects.
  Recommendation: Avoid landing page redirects.
  Audit method: PageSpeed Tools > Insights
  Reference: https://developers.google.com/speed/docs/insights/AvoidRedirects and https://developers.google.com/search/mobile-sites/mobile-seo/separate-urls#automatic-redirection

Requirement: Images are < 100 KB.
  Recommendation: Keep images under 100 KB.
  Audit method: Screaming Frog > Images > Over 100 kb
  Reference: https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/image-optimization#image_optimization_checklist
Mobile [Grade: Incomplete]

Requirement: Mobile page speed is < 3 seconds.
  Recommendation: Make sure your mobile page speed is less than 3 seconds.
  Audit method: https://testmysite.thinkwithgoogle.com/
  Reference: https://developers.google.com/speed/

Requirement: Clickable elements are not too close together.
  Recommendation: Size and space buttons and navigational links to be suitable for your mobile visitors.
  Audit method: GSC > Search Traffic > Mobile Usability
  Reference: https://support.google.com/webmasters/answer/6101188?hl=en

Requirement: Text is not too small to read.
  Recommendation: After specifying a viewport for your web pages, set your font sizes to scale properly within the viewport.
  Audit method: GSC > Search Traffic > Mobile Usability
  Reference: https://support.google.com/webmasters/answer/6101188?hl=en

Requirement: All plugins are compatible.
  Recommendation: Make sure all plugins are compatible.
  Audit method: GSC > Search Traffic > Mobile Usability
  Reference: https://support.google.com/webmasters/answer/6101188?hl=en

Requirement: A viewport is specified.
  Recommendation: Your pages should specify a viewport using the meta viewport tag.
  Audit method: GSC > Search Traffic > Mobile Usability
  Reference: https://support.google.com/webmasters/answer/6101188?hl=en

Requirement: A fixed-width viewport is not used.
  Recommendation: Adopt a responsive design for your site's pages, and set the viewport to match the device's width and scale accordingly.
  Audit method: GSC > Search Traffic > Mobile Usability
  Reference: https://support.google.com/webmasters/answer/6101188?hl=en

Requirement: Content is sized to the viewport.
  Recommendation: Make sure the pages use relative width and position values for CSS elements, and make sure images can scale as well.
  Audit method: GSC > Search Traffic > Mobile Usability
  Reference: https://support.google.com/webmasters/answer/6101188?hl=en

Requirement: Flash is not used.
  Recommendation: We recommend designing your look and feel and page animations using modern web technologies.
  Audit method: GSC > Search Traffic > Mobile Usability
  Reference: https://support.google.com/webmasters/answer/6101188?hl=en
Meta data [Grade: Incomplete]

Requirement: Title tags are the correct length.
  Recommendation: Make sure your title tags are about 50-60 characters long and contain targeted keywords.
  Audit method: Screaming Frog > Title 1
  Reference: https://support.google.com/webmasters/answer/79812?hl=en

Requirement: Title tags are not duplicated.
  Recommendation: Create individual title tags for your pages.
  Audit method: GSC > Search Appearance > HTML Improvements / SF > Page Titles > Duplicate
  Reference: https://support.google.com/webmasters/answer/79812?hl=en

Requirement: Meta descriptions are the correct length.
  Recommendation: Meta descriptions should be about 300 characters long.
  Audit method: Screaming Frog > Meta description
  Reference: https://support.google.com/webmasters/answer/79812?hl=en

Requirement: Meta descriptions are not duplicated.
  Recommendation: Create compelling meta descriptions with a call to action. Make sure meta descriptions are not duplicated.
  Audit method: GSC > Search Appearance > HTML Improvements / SF > Meta Descriptions > Duplicate
  Reference: https://support.google.com/webmasters/answer/79812?hl=en
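Both length checks can be automated by parsing each page's head section; a minimal sketch using Python's built-in html.parser follows (the sample page is made up for illustration).

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Collect the <title> text and meta description from one page (sketch)."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "description":
                self.description = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

page = (
    "<html><head><title>Blue Widgets | Example Shop</title>"
    '<meta name="description" content="Hand-made blue widgets, free shipping.">'
    "</head><body></body></html>"
)
audit = MetaAudit()
audit.feed(page)
print("title length:", len(audit.title))              # flag if far outside ~50-60 chars
print("description length:", len(audit.description))  # flag if far over the target length
```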
Requirement: Structured data is correctly implemented.
  Recommendation: Make sure structured data is correctly implemented. Pages with structured data errors are listed in the Structured data errors tab.
  Audit method: GSC > Search Appearance > Structured Data / Manual check on a sample of pages / Google Structured Data Testing Tool
  Reference: http://schema.org/docs/gs.html