[Client Logo]
Website: SEO Audit Overview
This technical SEO audit addresses issues which may affect the ability of The Pi Hut website to rank well in organic search for competitive terms. Each identified issue is followed by a recommendation based on industry best practices. Where the recommendation is not possible, alternative actions may be suggested to mitigate the effects of the issue.

Legend:
Low Priority
Medium Priority
High Priority
Each item below lists the following fields: Item number and Issue Name, Priority (SEO Benefit), Effort, Owner, Issue Description, Solution, and Search Performance Benefit (where applicable).
Item 1 - HTTP URLs
Priority (SEO Benefit): High | Effort: High | Owner: Client
Issue Description: HTTP URLs encountered in the crawl. All websites should be served securely over HTTPS today. Not only is this important for security, it is now expected by users. Chrome and other browsers display a 'Not Secure' message against any URLs that are HTTP, or that have mixed content issues (where they load insecure resources). To view how these URLs were discovered, view their 'inlinks' in the lower window tab. You can also export any pages that link to HTTP URLs via 'Bulk Export > Security > HTTP URLs Inlinks'.
Solution: All URLs should point to secure HTTPS pages. Pages should be served over HTTPS, internal links should be updated to the HTTPS versions, and HTTP URLs should 301 redirect to their HTTPS versions. HTTP URLs identified in this filter that already redirect to HTTPS versions should be updated to link directly to the correct HTTPS versions.
Search Performance Benefit: Search engines will trust the site as a safe site that provides valuable information to users, which could result in a boost in the search results.
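As a sketch of the redirect recommendation, a server-level HTTP-to-HTTPS 301 redirect might look like the following (nginx syntax; the domain is a placeholder and the site's actual server configuration may differ):

```nginx
# Redirect all plain-HTTP requests to the HTTPS version with a permanent (301) redirect.
server {
    listen 80;
    server_name example.com www.example.com;  # placeholder domain
    return 301 https://$host$request_uri;
}
```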
Item 2 - Mixed Content
Priority (SEO Benefit): Medium | Effort: Medium | Owner: Climb Online
Issue Description: HTML pages loaded over a secure HTTPS connection that load resources such as images, JavaScript or CSS via an insecure HTTP connection. Mixed content weakens HTTPS and makes otherwise secure pages easier to eavesdrop on and compromise. Browsers might automatically block the HTTP resources from loading, or they may attempt to upgrade them to HTTPS. HTTP resources can be viewed in the 'outlinks' tab for each URL, and exported alongside the pages they are on via 'Bulk Export > Security > Mixed Content'.
Solution: All HTTP resources should be changed to HTTPS to avoid security issues and problems loading in a browser.
Search Performance Benefit: This builds search engines' trust in the website, as every resource loaded on the site is served over the secure HTTPS version.
Item 3 - Unsafe Cross-Origin Links
Priority (SEO Benefit): Low | Effort: Low | Owner: Climb Online
Issue Description: URLs that link to external websites using the target="_blank" attribute (to open in a new tab) without also using rel="noopener" (or rel="noreferrer"). Using target="_blank" alone leaves those pages exposed to both security and performance issues.
Solution: The rel="noopener" link attribute should be used on any links that contain the target="_blank" attribute to avoid security and performance issues.
Search Performance Benefit: None directly; this is simply a standard for setting external links on websites.
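A minimal example of the recommended markup (the URL and link text are placeholders):

```html
<!-- target="_blank" paired with rel="noopener" so the new tab cannot access window.opener -->
<a href="https://example.com" target="_blank" rel="noopener">External link</a>
```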
Item 4 - Protocol-Relative Resource Links
Priority (SEO Benefit): Low | Effort: Low | Owner: Client
Issue Description: URLs that load resources such as images, JavaScript and CSS using protocol-relative links. A protocol-relative link is simply a link to a URL without specifying the scheme (it begins with // rather than https://). It saves developers from having to specify the protocol and lets the browser determine it based upon the current connection to the resource. However, this technique is now an anti-pattern with HTTPS everywhere, and can expose some sites to 'man in the middle' compromises and performance issues.
Solution: Update any resource links to absolute links including the scheme (HTTPS) to avoid security and performance issues.
Item 5 - Missing HSTS Header
Priority (SEO Benefit): High | Effort: High | Owner: Client
Issue Description: URLs that are missing the HSTS response header. The HTTP Strict-Transport-Security (HSTS) response header instructs browsers that the site should only be accessed using HTTPS, rather than HTTP. If a website accepts a connection over HTTP before redirecting to HTTPS, visitors will initially still communicate over HTTP. The HSTS header instructs the browser never to load the site over HTTP and to automatically convert all requests to HTTPS.
Solution: The HSTS header should be used across all pages to instruct the browser that it should always request pages via HTTPS, rather than HTTP.
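A minimal sketch of the header in nginx syntax (the one-year max-age and includeSubDomains are common starting points, not requirements, and should be confirmed before rollout):

```nginx
# Send the HSTS header on every HTTPS response; max-age is one year in seconds.
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
```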
Item 6 - Missing Content-Security-Policy Header
Priority (SEO Benefit): Low | Effort: Low | Owner: Client
Issue Description: URLs that are missing the Content-Security-Policy response header. This header allows a website to control which resources are loaded for a page.
Solution: Set a strict Content-Security-Policy response header across all pages to help mitigate cross-site scripting (XSS) and data injection attacks. The SEO Spider only checks for the existence of the header, and does not interrogate the policies found within it to determine whether they are well set up for the website. This should be reviewed manually.
Search Performance Benefit: This policy can help guard against cross-site scripting (XSS) attacks that exploit the browser's trust in the content received from the server.
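Purely as an illustration of the header's shape - a real policy must be built around the site's actual scripts, styles and third-party resources, or it will break pages:

```http
Content-Security-Policy: default-src 'self'; img-src 'self' https:; script-src 'self'
```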
Item 7 - Missing X-Content-Type-Options Header
Priority (SEO Benefit): Low | Effort: Low | Owner: Climb Online
Issue Description: URLs that are missing the 'X-Content-Type-Options' response header with a 'nosniff' value. In the absence of a MIME type, browsers may 'sniff' to guess the content type in order to interpret it correctly for users. However, this can be exploited by attackers, who can try to load malicious code such as JavaScript via an image they have compromised.
Solution: To minimise security issues, the X-Content-Type-Options response header should be supplied and set to 'nosniff'. This instructs browsers to rely only on the Content-Type header and block anything that does not match accurately. This also means the content type set needs to be accurate.
Item 8 - Missing X-Frame-Options Header
Priority (SEO Benefit): Low | Effort: Low | Owner: Climb Online
Issue Description: URLs missing an X-Frame-Options response header with a 'DENY' or 'SAMEORIGIN' value. This instructs the browser not to render the page within a frame, iframe, embed or object.
Solution: To minimise security issues, the X-Frame-Options response header should be supplied with a 'DENY' or 'SAMEORIGIN' value.
Search Performance Benefit: This helps avoid 'clickjacking' attacks, where your content is displayed on another web page that is controlled by an attacker.
Item 9 - Missing Secure Referrer-Policy Header
Priority (SEO Benefit): Low | Effort: Low | Owner: Climb Online
Issue Description: URLs missing 'no-referrer-when-downgrade', 'strict-origin-when-cross-origin', 'no-referrer' or 'strict-origin' policies in the Referrer-Policy header. When using HTTPS, it's important that URLs do not leak in non-HTTPS requests. This can expose users to 'man in the middle' attacks, as anyone on the network can view them.
Solution: Consider setting a referrer policy of strict-origin-when-cross-origin. It retains much of the referrer's usefulness while mitigating the risk of leaking data cross-origin.
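As a sketch, the nosniff, frame options and referrer policy header recommendations above can be set together (nginx syntax; the exact values should be confirmed against the site's requirements):

```nginx
# Block MIME sniffing, framing by other origins, and referrer leakage.
add_header X-Content-Type-Options "nosniff" always;
add_header X-Frame-Options "SAMEORIGIN" always;
add_header Referrer-Policy "strict-origin-when-cross-origin" always;
```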
Item 10 - Bad Content Type
Priority (SEO Benefit): Low | Effort: Low | Owner: Climb Online
Issue Description: URLs where the actual content type does not match the content type set in the header, including any invalid MIME types used. This is particularly important when the X-Content-Type-Options: nosniff response header is set by the server, as browsers then rely on the content type header to correctly process the page. For example, it can cause HTML web pages to be downloaded instead of rendered when they are served with a MIME type other than text/html.
Solution: Analyse URLs identified with a bad content type, and set an accurate MIME type in the Content-Type header.
Item 11 - Blocked by Robots.txt
Priority (SEO Benefit): Low | Effort: Low | Owner: Climb Online
Issue Description: Internal and external URLs blocked by the site's robots.txt. These cannot be crawled, which is a critical issue if you want the page content to be crawled and indexed by search engines.
Solution: Review the URLs to ensure they should be disallowed. If they are incorrectly disallowed, the site's robots.txt should be updated to allow them to be crawled. Consider whether you should be linking internally or externally to these URLs, and remove links where appropriate.
Item 12 - No Response
Priority (SEO Benefit): Medium | Effort: Medium | Owner: Client
Issue Description: Internal and external URLs with no response returned from the server. This is usually due to a malformed URL, a connection timeout, a connection error, or a connection refused.
Solution: Unresponsive URLs should be updated to the correct location. Other connection issues can often be resolved by using different user-agents ('Config > User-Agent'), adjusting the crawl speed ('Config > Speed') or disabling firewalls & proxies.
Item 13 - Redirection (3xx)
Priority (SEO Benefit): High | Effort: High | Owner: Climb Online
Issue Description: Internal and external URLs which redirect to another URL. These include server-side redirects, such as 301 or 302 redirects (and more).
Solution: Ideally all internal links would point to canonical, resolving URLs, avoiding links to URLs that redirect. This reduces the latency of redirect hops for users and improves efficiency for search engines.
Search Performance Benefit: Redirecting old or non-existent pages leads users to similar pages that could provide the information they are looking for, which maintains the user experience on the website.
Item 14 - Client Error (4xx)
Priority (SEO Benefit): High | Effort: High | Owner: Climb Online
Issue Description: Internal and external URLs with a client-side error. This indicates a problem occurred with the URL request and can include responses such as 400 Bad Request, 403 Forbidden, 404 Page Not Found, 410 Removed, 429 Too Many Requests and more. A 404 'Page Not Found' is the most common, and is often referred to as a broken link.
Solution: All links on a website should ideally resolve to 200 'OK' URLs. Errors such as a 404 or 410 should be updated to their correct locations, or removed and redirected where appropriate. A 403 Forbidden error occurs when a web server denies access to the SEO Spider's request.
Search Performance Benefit: Crawling will be allocated to pages returning a 200 'OK' status, which could lead to more live pages being indexed.
Item 15 - Server Error (5xx)
Priority (SEO Benefit): High | Effort: High | Owner: Client?
Issue Description: Internal and external URLs where the server failed to fulfil an apparently valid request. This can include common responses such as 500 Internal Server Error and 503 Service Unavailable.
Solution: All URLs should respond with a 200 'OK' status. These errors might indicate a server that struggles under load, or a misconfiguration that requires investigation. If the URLs can be viewed in a browser, it's often not an issue.
Item 16 - Non-ASCII Characters
Priority (SEO Benefit): Low | Effort: Low | Owner: Climb Online
Issue Description: URLs with characters outside of the ASCII character-set. Standards outline that URLs can only be sent using the ASCII character-set, and some users may have difficulty with the subtleties of characters outside this range.
Solution: URLs should be converted into a valid ASCII format by encoding links to the URL with safe characters (made up of % followed by two hexadecimal digits). That said, browsers and search engines today are largely able to transform URLs accurately.
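As a sketch of the encoding recommendation, Python's standard library can percent-encode non-ASCII characters in a path (the example path is hypothetical):

```python
from urllib.parse import quote

def ascii_safe_path(path: str) -> str:
    """Percent-encode non-ASCII characters in a URL path (RFC 3986 style)."""
    # quote() keeps '/' intact by default and encodes other unsafe characters
    # as %XX sequences based on their UTF-8 bytes.
    return quote(path)

# Hypothetical example path containing a non-ASCII character:
print(ascii_safe_path("/products/café"))  # /products/caf%C3%A9
```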
Item 17 - Underscores
Priority (SEO Benefit): Low | Effort: Low | Owner: Climb Online
Issue Description: URLs with underscores, which are not always seen as word separators by search engines.
Solution: Ideally hyphens should be used as word separators rather than underscores. However, changing URLs is a big decision, and often it's not worth changing them for SEO purposes alone. If URLs are changed, appropriate 301 redirects must be implemented.
Item 18 - Uppercase Characters
Priority (SEO Benefit): Low | Effort: Low | Owner: Climb Online
Issue Description: URLs that contain uppercase characters. URLs are case sensitive, so as best practice URLs should generally be lowercase, to avoid potential mix-ups and duplicate URLs.
Solution: Ideally only lowercase characters should be used in URLs. However, changing URLs is a big decision, and often it's not worth changing them for SEO purposes alone. If URLs are changed, appropriate 301 redirects must be implemented.
Item 19 - Repetitive Path
Priority (SEO Benefit): Low | Effort: Low | Owner: Climb Online
Issue Description: URLs that have repetitive paths or subfolders within the string. In some cases this can be legitimate and logical; however, it also often points to poor URL structure and potential improvements. It can also help identify issues with incorrect relative linking causing infinite URLs.
Solution: Ideally any URL should be as concise as possible. However, changing URLs is a big decision, and often it's not worth changing them for SEO purposes alone. If URLs are changed, appropriate 301 redirects must be implemented.
Search Performance Benefit: While not always an issue, repetitive paths aren't particularly user friendly, and could be causing crawling issues if due to incorrect relative linking.
Item 20 - URL Contains Space
Priority (SEO Benefit): Low | Effort: Low | Owner: Climb Online
Issue Description: URLs that contain a space. These are considered unsafe and could cause the link to break when the URL is shared.
Solution: Hyphens should be used as word separators instead of spaces.
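The URL recommendations above (hyphens rather than underscores or spaces, lowercase characters) can be sketched as a small normalisation helper; the function name and exact rules are illustrative, not part of the audit:

```python
import re

def slugify(segment: str) -> str:
    """Normalise a URL path segment: lowercase, with spaces and
    underscores replaced by single hyphens."""
    segment = segment.strip().lower()          # lowercase characters only
    segment = re.sub(r"[\s_]+", "-", segment)  # spaces/underscores -> hyphens
    segment = re.sub(r"-{2,}", "-", segment)   # collapse repeated hyphens
    return segment

print(slugify("Raspberry Pi Accessories"))  # raspberry-pi-accessories
```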
Item 21 - Internal Search
Priority (SEO Benefit): Low | Effort: Low | Owner: Climb Online
Issue Description: URLs that might be part of the website's internal search function. Google and other search engines recommend blocking internal search pages from being crawled, to limit sometimes duplicate and low-quality pages from being crawled and indexed.
Solution: Most internal site searches are built for users rather than search engines, which may needlessly crawl and index them. As best practice, most search sections of a site should not be linked to internally, and should be disallowed in robots.txt.
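As a sketch, assuming the internal search lives under a /search path with a 'q' parameter (both are assumptions to confirm against the site's actual URLs), the robots.txt rules might look like:

```text
User-agent: *
Disallow: /search
Disallow: /*?q=
```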
Item 22 - URL Parameters
Priority (SEO Benefit): Low | Effort: Low | Owner: Climb Online
Issue Description: URLs that include parameters such as '?' or '&'.
Solution: Where possible, use a static URL structure without parameters for key indexable URLs.
Search Performance Benefit: This isn't an issue for Google or other search engines to crawl unless at significant scale, but it's recommended to limit the number of parameters in a URL; they can be complicated for users and can be a sign of low value-add URLs.
Item 23 - URLs Over 115 Characters
Priority (SEO Benefit): Low | Effort: Low | Owner: Climb Online
Issue Description: URLs that are longer than the configured limit (115 characters).
Solution: Where possible, use logical and concise URLs for users and search engines.
Search Performance Benefit: This is generally not an issue; however, research has shown that users prefer shorter, concise URL strings.
Item 24 - Missing Page Title
Priority (SEO Benefit): High | Effort: High | Owner: Climb Online
Issue Description: Pages where the page title element is missing, the content is empty, or it contains only whitespace. Page titles are read and used by both users and search engines to understand what a page is about.
Solution: It's essential to write concise, descriptive and unique page titles on every indexable URL to help users, and to enable search engines to score and rank the page for relevant search queries.
Search Performance Benefit: Page titles are important for SEO as they are used in rankings, and vital for user experience, as they are displayed in browsers, search engine results and on social networks.
Item 25 - Duplicate Page Title
Priority (SEO Benefit): High | Effort: High | Owner: Climb Online
Issue Description: Pages which have duplicate page titles.
Solution: Update duplicate page titles as necessary, so each page contains a unique and descriptive title for users and search engines. If these are duplicate pages, fix them by linking to a single version, and redirect or use canonicals where appropriate.
Search Performance Benefit: It's really important to have distinct and unique page titles for every page. If every page has the same title, it can be more challenging for users and search engines to distinguish one page from another.
Item 26 - Page Title Over 60 Characters
Priority (SEO Benefit): High | Effort: High | Owner: Climb Online
Issue Description: Pages which have page titles that exceed the configured limit.
Solution: Write concise page titles to ensure important words are not truncated in the search results, hidden from users, or potentially weighted less in scoring.
Search Performance Benefit: Characters over the limit might be truncated in Google's search results and carry less weight in scoring.
Item 27 - Page Title Below 30 Characters
Priority (SEO Benefit): High | Effort: High | Owner: Climb Online
Issue Description: Pages which have page titles under the configured limit.
Solution: Consider updating the page title to take advantage of the space left to include additional target keywords or USPs.
Search Performance Benefit: This isn't necessarily an issue, but it does indicate there might be room to target additional keywords or communicate your USPs.
Item 28 - Page Title Same as H1
Priority (SEO Benefit): Low | Effort: Low | Owner: Climb Online
Issue Description: Page titles which match the <h1> on the page exactly.
Solution: This is not necessarily an issue, but may point to a potential opportunity to target alternative keywords, synonyms, or related key phrases.
Item 29 - Missing Meta Description
Priority (SEO Benefit): High | Effort: High | Owner: Climb Online
Issue Description: Pages where the meta description is missing, the content is empty, or it contains only whitespace.
Solution: It's important to write unique and descriptive meta descriptions on key pages to communicate the purpose of the page to users, and entice them to click on your result over the competition. It can also mean Google uses this description for snippets in the search results for some queries, rather than making up their own based upon the content of the page.
Search Performance Benefit: This is an opportunity to communicate the benefits of your product or service and influence click-through rates for important URLs.
Item 30 - Duplicate Meta Description
Priority (SEO Benefit): High | Effort: High | Owner: Climb Online
Issue Description: Pages which have duplicate meta descriptions.
Solution: Update duplicate meta descriptions as necessary, so important pages contain a unique and descriptive meta description for users and search engines. If these are duplicate pages, fix them by linking to a single version, and redirect or use canonicals where appropriate.
Search Performance Benefit: It's really important to have distinct and unique meta descriptions that communicate the benefits and purpose of each page. If they are duplicate or irrelevant, they will be ignored by search engines in their snippets.
Item 31 - Meta Description Over 115 Characters
Priority (SEO Benefit): High | Effort: High | Owner: Climb Online
Issue Description: Pages which have meta descriptions over the configured limit. Characters over this limit might be truncated in Google's search results.
Solution: Write concise meta descriptions to ensure important words are not truncated in the search results and hidden from users.
Item 32 - Meta Description Below 70 Characters
Priority (SEO Benefit): High | Effort: High | Owner: Climb Online
Issue Description: Pages which have meta descriptions below the configured limit.
Solution: Consider updating the meta description to take advantage of the space left to include additional benefits, USPs or calls to action to improve click-through rate (CTR).
Search Performance Benefit: This isn't strictly an issue, but an opportunity. There is additional room to communicate benefits, USPs or calls to action.
Item 33 - Missing H1
Priority (SEO Benefit): High | Effort: High | Owner: Climb Online
Issue Description: Pages where the <h1> is missing, the content is empty, or it contains only whitespace.
Solution: Ensure important pages have concise, descriptive and unique headings to help users, and to enable search engines to score and rank the page for relevant search queries.
Search Performance Benefit: The <h1> should describe the main title and purpose of the page, and is considered one of the stronger on-page ranking signals.
Item 34 - Duplicate H1
Priority (SEO Benefit): High | Effort: High | Owner: Climb Online
Issue Description: Pages which have duplicate <h1>s. It's important to have distinct, unique and useful main headings.
Solution: Update duplicate <h1>s as necessary, so important pages contain a unique and descriptive <h1> for users and search engines. If these are duplicate pages, fix them by linking to a single version, and redirect or use canonicals where appropriate.
Search Performance Benefit: If every page has the same <h1>, it can be more challenging for users and search engines to distinguish one page from another.
Item 35 - H1 Over 70 Characters
Priority (SEO Benefit): High | Effort: High | Owner: Climb Online
Issue Description: Pages which have <h1>s over the configured length.
Solution: Write concise <h1>s for users, including target keywords where natural - without keyword stuffing.
Search Performance Benefit: There is no hard limit for characters in an <h1>; however, headings should be clear and concise for users, and long headings might be less helpful.
Item 36 - Multiple H1
Priority (SEO Benefit): High | Effort: High | Owner: Climb Online
Issue Description: Pages which have multiple <h1>s. While this is not strictly an issue, because HTML5 standards allow multiple <h1>s on a page, there are some usability problems with this approach. It's advised to use heading rank (h1-h6) to convey document structure.
Solution: Consider updating the HTML to include a single <h1> on each page, and utilise the full heading rank (h2-h6) for additional headings.
Search Performance Benefit: The classic HTML4 standard defines that there should only be a single <h1> per page, and this is still generally recommended for users and SEO.
Item 37 - Missing Alt Text
Priority (SEO Benefit): High | Effort: High | Owner: Climb Online
Issue Description: Images that have an alt attribute, but are missing alt text.
Solution: Include descriptive alt text for images to help users and search engines understand them better. Where possible, decorative images should be loaded via CSS background images; alternatively, a null (empty) alt attribute (alt="") should be provided so that they can be ignored by assistive technologies, such as screen readers.
Search Performance Benefit: Images should have descriptive alternative text about their purpose, which helps blind and visually impaired users, and search engines, understand the image and its relevance to the web page.
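For illustration (the filenames and alt values are hypothetical), the two cases might be marked up as:

```html
<!-- Informative image: descriptive alt text for users and search engines -->
<img src="pi-starter-kit.jpg" alt="Raspberry Pi starter kit with case and power supply">

<!-- Decorative image: null alt text so screen readers can skip it -->
<img src="divider.png" alt="">
```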
Item 38 - Structured Data
Priority (SEO Benefit): High | Effort: High | Owner: Climb Online
Issue Description: Two structured data implementations were found on the website.
Solution: Implement only one structured data markup.
Search Performance Benefit: Structured data is used by search engines to easily identify what the website is about and the nature of the business it belongs to. This helps the website rank on relevance within its target industry.
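Purely as an illustrative sketch, a single consolidated structured data block in JSON-LD might look like the following (the schema type, name and URL are placeholders; the correct type depends on the business and on which duplicated markup is removed):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Store",
  "url": "https://www.example.com/"
}
</script>
```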