When your business is not ranking as expected, you ought to run a thorough technical SEO audit before turning to content optimisation or link building.
Not sure where to start?
Check out this in-depth checklist of the common technical SEO elements. You will learn why they are important and how to resolve the issues uncovered during the audit.
Contents
The term technical SEO refers to search engine optimisation processes that relate to the technical aspects of the website.
The goal of technical SEO is to improve the crawlability, indexability, and speed of the site. It also shapes the site’s architecture.
Technical SEO may not be as exciting as keyword research, developing a content strategy, or starting a link building campaign. However, your SEO efforts will be in vain without proper technical SEO.
Think of technical SEO as the foundation of a house. If a site’s technical foundation is weak, no amount of creative content, promotion or link building will increase its organic traffic.
We are moving forward in an increasingly fast-paced digital world where users expect a site to load almost instantly.
Google, being the dominant search engine, is committed to serving its users with pages that are optimised for speed. The search engine is also keen to keep pages that are littered with broken links or page errors out of its results.
Users get frustrated when they click on a page on Google, only to get a ‘404 not found’ error. Neither are they pleased when served low-quality, thin content.
This is where technical SEO kicks in: it ensures that your website is optimised to create the best user experience. A good technical SEO implementation paves the way for on-page and off-page optimisation.
Not all technical SEO work requires intervention from a web developer. If you set up your website on WordPress or a similar platform, you can run through most of the audit and resolve the issues yourself.
Only one version of your website should be accessible at any one time. By default, a site is accessible with or without the ‘www.’ prefix.
For example, both of these URLs will return the same content.
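Using mybusiness.com as a placeholder:
https://www.mybusiness.com
https://mybusiness.com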
While the pages may appear the same to you, they are treated as different versions by Google. Having different versions of the website may lead to Google treating the pages as duplicate content, or weakening the SEO value of the pages.
The same applies to a site with ‘http’ and ‘https’. Both of the following URLs could return the same content but are different versions on Google.
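Again with the placeholder domain:
http://mybusiness.com
https://mybusiness.com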
If nothing is done to address this issue, Google will index the version that it considers the best option.
You will need to ensure that Google only uses one version of the site. It doesn’t matter if you are going for ‘www’ or non-’www’, but you need to have ‘https’ for better SEO ranking.
Once you have decided on the version, you will need to set canonical tags on the alternate versions, pointing to the URL you want indexed.
Here’s an example of a canonical tag.
<link rel="canonical" href="https://mybusiness.com"/>
The canonical tag, when placed on the homepage of the ‘www’ or ‘http’ version, tells Google that https://mybusiness.com is the preferred version.
With the canonical tags, Google is clear on which version is to be indexed and displayed on the search results.
Canonical tags are placed in the head of your page’s HTML code. If you are using the Yoast SEO plugin, you can insert the target URL directly into the Canonical URL field.
When you are browsing the internet, you may find that some websites have the ‘http’ prefix while others have ‘https’.
HTTP stands for hypertext transfer protocol, and HTTPS is the secure version of it. With HTTPS, data transmitted between your browser and the web server is encrypted, which prevents it from being intercepted by malicious third parties.
Sites that deal with financial transactions are required to be protected by HTTPS. In 2014, Google made HTTPS one of its SEO ranking signals. If you are still stuck with HTTP or do not have HTTPS properly configured, your site may struggle to rank on Google.
In order to implement HTTPS, you will need to install an SSL cert for your site. Usually, this is performed by the web hosting provider.
If your site has an SSL cert installed, you will find a lock icon next to the URL.
Otherwise, you will see a ‘Not Secure’ warning if your site does not have an SSL cert or it is not properly configured.
You will need to contact your hosting provider to purchase and install an SSL cert. When the HTTPS version is up, you will need to redirect pages from the old HTTP version to HTTPS. This prevents Google from detecting duplicate content.
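If your site runs on an Apache server, a minimal sketch of this redirect in the .htaccess file could look like the following (your hosting provider or developer may use a different method):
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]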
Your site architecture is how you organise and structure the pages within it. When properly optimised, it makes navigation easy for readers and improves crawlability for Googlebot.
Good site architecture groups pages by category and interlinks them. It is arranged hierarchically from the home page, with the deepest page being four clicks or fewer from the top.
Conversely, a bad site architecture, characterised by a disorganised structure, poor interlinking and deeply nested pages, is bad for SEO. If a reader needs 10 clicks to navigate to a specific page, Google may have trouble crawling it.
To optimise your site’s architecture, you need to place important pages close to the home page. For a business website, these pages usually form the top tier of the architecture.
The first-level pages are then expanded into product items, landing pages, service descriptions, and articles. They are also linked to their corresponding upper-level pages.
To further assist readers in navigating your site, you can use breadcrumbs. A breadcrumb is a navigational feature that shows where a page sits within the site architecture.
Here’s how breadcrumbs look on one of our pages.
Users can easily navigate to other pages with the breadcrumbs, instead of clicking through the menus.
A proper URL structure is easy to read for both users and search engines. If it’s not properly optimised, the URL can be unintelligible.
A URL should give a brief idea of the page’s content. It should also contain the page’s primary keyword, as the URL structure slightly influences Google rankings.
For example, a URL made up of random characters provides no information on the photos on the page.
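A hypothetical example of such a URL: https://mybusiness.com/gallery?p=8xKq29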
When you are setting up URL structures, there are a few things to keep in mind.
These are what SEO-friendly structures look like.
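For illustration, using the mybusiness.com placeholder:
https://mybusiness.com/services/seo-audit/
https://mybusiness.com/blog/technical-seo-checklist/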
A good URL structure often reflects the site’s architecture. The subfolder is reflected in the URL, which helps users and Google understand the context of the content.
Robots.txt is a file hosted on your site that contains instructions for search engine crawlers. It can be used to instruct specific crawlers on which pages are to be crawled or skipped.
An incorrectly set up robots.txt is one of the most common technical SEO issues, and it can lead to low visibility or pages failing to be indexed.
You can find your site’s robots.txt by appending the filename to the domain as follows:
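https://mybusiness.com/robots.txt
(Here, mybusiness.com stands in for your own domain.)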
A basic robots.txt that allows all search engine crawlers to access all pages will look like this:
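User-agent: *
Disallow:
(The empty Disallow line means no page is blocked.)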
Here’s one with specific allows and disallows for the crawlers.
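User-agent: *
Disallow: /tag/
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php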
It tells the crawlers to avoid URLs under /tag/ and /wp-admin/ but allows access to /wp-admin/admin-ajax.php.
You can also specify instructions for a specific crawler, like this:
User-agent: Googlebot
Disallow: /tag/
If you have specific pages or subfolders that are used for testing, it will be a good idea to disallow access in the robots.txt.
More importantly, you don’t want to accidentally block access by Googlebot by setting the wrong instruction in the file.
You can use Google’s robots.txt tester tool in the Search Console to validate the file.
An XML sitemap contains links to important pages and resources on your website. The goal of having an XML sitemap is to enable Google to crawl the pages easily.
If your site has thousands of pages and no XML sitemap, there’s a chance that Google may miss some of them. Without an XML sitemap, some pages may not be reflected correctly in the search results.
An XML sitemap should not be confused with an HTML sitemap. The former is created for search engine crawlers while the latter is meant for humans. The HTML sitemap has no effect on crawlability and SEO.
You can find your sitemap at ‘domain.com/sitemap.xml’ or ‘domain.com/sitemap_index.xml’. The sitemap index lists the types of resources covered, each in its own sitemap.
Clicking into one of the sitemaps will provide an extensive list of links.
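For reference, here is a minimal sketch of the entries inside an XML sitemap (the URL and date are illustrative):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://mybusiness.com/services/</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
</urlset>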
Here’s how to optimise your XML sitemap to make full use of Google’s crawl budget.
You will need to ensure that the sitemap is accessible to Google. Go to Search Console → Sitemaps and enter the URL of the sitemap.
We have mentioned the importance of getting your pages indexed. At the same time, there are pages that should not be indexed.
If you are running an e-commerce business, a particular product may be accessed via different URLs. This can create an issue of duplicate content and you will want to ensure only one version is indexed.
Other pages like the privacy policy and terms of service are quite generic. Most businesses copy the content from similar templates. These pages are better kept no-indexed to prevent duplicate content issues.
Some pages, such as ‘Thank You’ or ‘Download’ pages, are not meant to be found through search. These pages are part of your marketing funnel, and you don’t want users accessing the download links directly from the search results.
The login page of your website’s CMS should not appear in the search results. However, if it’s not specifically no-indexed, there’s a chance that it may show up. The only login page that should be indexed is one used by your customers.
Tag, category and author pages are also deemed unimportant, unless you are running a news website with popular authors. Otherwise, it is better to keep them away from Googlebot.
Once you have identified the pages that need to be no-indexed, you will need to add the following line of code into the HEAD section of the page.
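<meta name="robots" content="noindex">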
The HTML code tells the search engine crawler not to index the particular page.
If you are using the Yoast plugin, you can disable indexing without editing the HTML code.
Canonical URLs are used to direct users to the main version of a page. The main purpose of URL canonicalization is to prevent duplicate content from affecting SEO. It also prevents SEO value from being diluted over the multiple versions of the site.
Usually, there are a few versions of your site that can be returned by Google. This is particularly true for the home page.
Google may return ‘www.yourhomepage.com’ or ‘yourhomepage.com’ when someone searches for the company’s name.
Including a canonical URL on the home page tells Google the exact version that should be displayed.
A canonical URL is defined in the HTML code of the page and has the following format.
<link rel="canonical" href="target URL" />
It tells the search engine that the version it should refer to is at the target URL. Even if you don’t maintain multiple versions of your site, it is still best practice to have a canonical URL referencing the page itself.
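For example, a page at https://mybusiness.com/services/ would carry a canonical tag pointing to its own URL:
<link rel="canonical" href="https://mybusiness.com/services/" />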
Here’s how backlinko.com includes a canonical URL pointing to the same page.
When you have changed the URL of a page or moved to a new domain, you will set up a 301 redirect. Redirect chains occur when there is more than one 301 redirect between the original URL and the final destination.
For example, your site shifted from HTTP to HTTPS and you have your first redirect:
Site A (HTTP) → Site B (HTTPS)
Then, you decided to move the site to a new domain and you will have:
Site A (HTTP) → Site B (HTTPS) → Site C (New Domain)
As you keep making changes to the site, such as amending the URL for certain pages, you will have instances of lengthy redirect chains.
Redirect chains affect SEO in two ways: each extra hop adds loading time for users and crawlers, and link value can leak with every redirect.
If you have built high-quality backlinks to the previous URL, they are commonly estimated to lose around 15% of their value for each redirect.
Therefore, you will want to eliminate redirect chains. In order to do that, you need to identify redirect chains in your website, and you can do so with Screaming Frog.
Once you have identified the chains, you will need to fix them by removing the intermediary URLs from the chain.
The final fix will be:
Site A (HTTP) → Site C (New Domain)
Since the Google Panda update in 2011, thin and duplicate content can lead to a loss of ranking or manual penalties.
Rather than being measured by word count, thin content is more accurately defined by the value it offers to users.
Content produced through scraper tools or low-quality copywriting is thin content. Such content does not satisfy the search intent and is pushed down in the search rankings.
Often, thin content is published by spammy sites hoping to game the search engine. Unfortunately for them, such black-hat tactics no longer work.
If you are not sure whether you are producing thin content, check the existing pages that top the rankings and look at the topics they cover.
Remember that it’s not all about word count, but whether the content is written in detail and answers the query satisfactorily.
Another common issue that greatly affects SEO is publishing duplicate content. Having multiple copies of the same content is bad for SEO, as Google struggles to decide which version is more important.
Duplicate content that results from plagiarism carries stiffer penalties.
Whether it’s an exact or partial copy, Google can detect plagiarism and issue manual penalties. Duplicate content may also lead to removal from the search engine if Google receives a complaint that you are publishing someone else’s content without permission.
You can use tools like Ahrefs, SE Ranking or Copyscape to check for duplicate content.
Whenever an incorrect URL is entered, or a page is not found on your site, the user will get an Error 404 page.
Here’s a basic, unoptimised 404 page.
404 errors do not affect SEO directly, as they are codes returned when a page couldn’t be found.
However, some 404 errors may affect SEO rankings. If you have deleted a page that you had previously built backlinks to, then you are missing out on that link equity.
Despite the ambiguous effect on SEO, you will still want to optimise your 404 pages. Landing on a plain 404 page is bad for user experience, and it doesn’t reflect well on your brand.
Instead, you will want to create a customised 404 page to capture visitors who land on a deleted page or key in the wrong URL.
Some optimisation tips for a 404 page include:
Here’s how we have created an optimised 404 page for our website.
There are two ways to find 404 errors on your website using free tools. The first requires access to Google Search Console. Click Coverage and you will find existing 404 errors reported in red.
The 404 errors in Google Search Console are reported when the crawler detects pages that are physically missing from your site.
Another way to check for 404 errors is through Google Analytics. From the dashboard, click Behavior → Site Content → All Pages and search for ‘404’.
You will get a list of visits that result in 404 errors.
Based on the information, you can determine if the missing pages need to be replaced, or have the visitors directed to an optimised 404 page.
Orphan pages are bad news in a technical SEO audit. As the name implies, they are pages that have no internal links pointing to them.
As a result, Google is unable to reach these pages by crawling your site, and neither are users browsing from your home page.
Often, an orphan page is not indexed on Google and even if it is, it may not be updated frequently.
The SEO authority of the website also fails to reach the orphan pages. Even if you have built a good number of high-quality backlinks to your site, none of that value flows to the orphan pages.
Finding orphan pages is quite tricky. You will need to list all the pages on your site and cross out those that are crawlable.
Tools like Screaming Frog are helpful in identifying crawlable pages. The pages remaining on the list are orphan pages.
You will then need to build internal links to the orphan pages or consider removing them from the site.
When a user accesses your website, there’s a flurry of activity that involves data transmission between the browser and the server. In some cases, a query is greeted by HTTP errors.
HTTP errors indicate that something is amiss and the query could not be completed successfully. They are represented by status codes, broadly split into 4xx client errors such as 404 (not found) and 403 (forbidden), and 5xx server errors such as 500 (internal server error) and 503 (service unavailable).
For 5xx errors, you will need to get your web developer involved. Technical issues that are preventing users from accessing the site need to be resolved.
If you have removed some pages and it results in 404, you can redirect the users to the new pages. Redirection is done by implementing the following:
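As a minimal sketch, assuming an Apache server, a single 301 redirect can be added to the .htaccess file (the paths below are illustrative); WordPress users can achieve the same with a redirection plugin:
Redirect 301 /old-page/ https://mybusiness.com/new-page/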
Google has made page speed one of its ranking signals. If your site is loading too slow, it will affect its visibility.
Search engines aside, a slow-loading page all but guarantees users will hit the ‘back’ button. It’s bad for user experience and, ultimately, sends the conversion rate plummeting.
Based on Google studies, websites should keep the page load time below 3 seconds to keep the bounce rate low.
Page speed is affected by various factors, such as:
You can use Google PageSpeed Insights to find out issues and opportunities for improvement for your site’s page speed.
Here are a few ways to reduce the loading time of your web pages.
Since 2015, there have been more searches on mobile than on desktop. Today, mobile searches account for 58% of total queries.
In 2018, Google started prioritizing mobile-friendliness as one of the key ranking factors. It takes into account loading speed and responsiveness of a site on mobile.
If your website is only meant for desktop and has navigational problems on mobile, it will suffer in terms of visibility. This is because Google now indexes websites based on how they are displayed on mobile.
You can easily check if your website is mobile-friendly with Google Search Console. Click Mobile Usability under Enhancements and you will have a report on any issues for your site.
Here are some ways to get your site up to standard for mobile-friendliness:
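One foundational step, for example, is declaring a responsive viewport in the head of every page so it scales correctly on mobile screens:
<meta name="viewport" content="width=device-width, initial-scale=1">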
When your site caters to visitors from different regions or languages, you will want to use the Hreflang tag.
The Hreflang tag enables users to be directed to the proper version of a page based on their language and location. For example, a visitor from Indonesia will land on a page in the Indonesian language, even if he or she clicks on a page meant for Singaporean users.
Adding the Hreflang tag also helps Google to differentiate the various versions of your site. It prevents duplicate content issues when you have content catering to different countries but written in the same language.
Here’s the basic syntax for the Hreflang tag.
<link rel="alternate" href="https://mybusiness.com" hreflang="en-us" />
The important parameter is the Hreflang value. It should contain a mandatory language code, taken from the ISO 639-1 list, and an optional regional code from the ISO 3166-1 list.
If you are catering to visitors in different regions, such as Singapore, Indonesia and Malaysia, you will need to include Hreflang tags for the possible versions:
<link rel="alternate" href="https://mybusiness.com/sg/" hreflang="en-sg" />
<link rel="alternate" href="https://mybusiness.com/id/" hreflang="id-id" />
<link rel="alternate" href="https://mybusiness.com/my/" hreflang="ms-my" />
It’s also a good practice to add a fallback, in case Google couldn’t find a version that matches the language and region.
<link rel="alternate" href="https://mybusiness.com/" hreflang="x-default" />
A few things to note when adding Hreflang tags:
Schema markup, or structured data, is code added to your HTML that helps search engines better understand the context of the page.
With structured data, additional information can be displayed on the search result. The following screenshot shows how a list of events is displayed along with the generic displayed result.
Here’s another website that makes good use of structured data.
The additional information helps your page stand out in the search results, which attracts users to click through to the page and leads to higher traffic.
You can set up schema markup manually, but it’s going to be tedious. If you are optimizing schema for SEO, it’s better to use the Structured Data Markup Helper by Google.
The tool allows you to create structured data without coding.
Here’s an example of how we tag our homepage with the Structured Data Markup Helper.
Step 1: Enter the URL of the page and select the type of content. Click Start Tagging and an interactive structured data tagging page will be loaded.
Step 2: Click or highlight elements on the page and select the appropriate tag. Try to fill in as many tags as possible.
Step 3: Click Create HTML. By default, the structured data will be generated using the JSON-LD format.
Step 4: Download or save structured data.
You will now need to insert the structured data into the page. If you are using WordPress, you can use a plugin to insert the structured data. Otherwise, get your developer to insert it manually into the head of the webpage.
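As a rough sketch, the generated JSON-LD sits inside a script tag in the head of the page; the organisation name and URL below are placeholders:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "My Business",
  "url": "https://mybusiness.com"
}
</script>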
These are helpful tools that will help you to conduct a technical SEO audit.
Google Search Console gives you a good idea of how your website is performing in Google search. It provides detailed reports on coverage, mobile usability, performance and Core Web Vitals, which are becoming part of Google’s ranking signals.
If your site is facing security issues or penalties, they can be reviewed on Google Search Console. It’s also used for testing robots.txt and uploading the XML sitemap to Google.
Google Analytics is a free tool from Google that provides insights into your search traffic. You can filter visitors to your site by source, demographics and other metrics. It’s a great tool to check how users are accessing your pages via organic search.
Ahrefs is a comprehensive professional SEO solution that has a powerful site audit tool. It crawls your site and compiles the data for analysis. You can trace issues like HTTP errors, content quality, redirects and more with the custom filters.
Screaming Frog is highly popular amongst SEO professionals. It lets you audit basic SEO factors with the free version.
Paying for a subscription unlocks more functionality like integration with Google Analytics and Search Console. You will also get unlimited crawls and the ability to configure the crawler.
Another favourite amongst the SEO community, SEMrush has a wide range of tools that help with SEO audit. Its powerful crawler is useful in identifying issues like redirect chains, duplicate content, broken links and slow pages.
GTmetrix allows you to test your site’s loading speed. It scores the page accordingly and provides a list of factors that may affect loading speed.
Merkle has a nice collection of tools that will help you in a technical SEO audit. You can use Merkle’s tools to validate your robots.txt and generate an XML sitemap. Its mobile SEO tool is very useful for checking potential issues with mobile responsiveness.
This tool allows you to find out if your site’s organic traffic has been affected by a Google algorithm update. It pulls data from Google Analytics and overlays markers that indicate when the updates happened.
Creating schema markup is made simple with this tool from SchemaApp. It allows you to create structured data for your site and generate the JSON-LD code. Instead of you manually adding the structured data to the webpages, the tool automatically inserts the code back into your site.
It's possible to execute a technical SEO audit on your own. However, if you are running a business in Singapore, it’s better to engage Heroes of Digital for the job.
As detailed as a technical SEO guide like this is, it cannot replace the depth of experience we have. Our team has more than 10 years of combined experience in SEO. We have worked with various types of businesses and encountered challenges that are often not covered in common SEO guides.
Instead of going through trial-and-error, you will save time by getting a professional team to do the technical SEO audit. Your time is better spent on the core activities of your business.
Furthermore, our team has a solid track record. We have audited many companies and provided recommendations that positively affected their search rankings.
Heroes of Digital has always operated in an honest and transparent manner. We maintain an open communication channel and our clients are always kept in the loop of the audit progress.
Our growth as a leading SEO company is supported by loyal clients. Through various projects, we have demonstrated our credibility in delivering consistently high-quality results. We are also known to offer competitive pricing that matches the client’s budget and requirements.
Hesitate no further, get your free proposal today!
Get A Free Proposal