9 Steps to Perform a Technical SEO Audit for a Client’s Blog
SEO is one of the most important factors in determining the success of a website. We use blog content to power our SEO strategies and rank for keywords relevant to our offerings.
While your agency already knows the power of SEO and content, you need to perform regular technical SEO audits for your client’s blogs. Doing so ensures that the website is free of problematic errors that hinder the user experience and negatively impact search rankings.
In this article, we’ll review what goes into technical SEO, how to perform a technical SEO audit, and how often you should provide one for your client’s blog.
What is a Technical SEO Audit?
A technical SEO audit is the process of diagnosing a website's health as it relates to organic search rankings. The auditor uses a variety of tools to crawl existing web pages, verify the sitemap, and look for errors related to navigation, page loading speed, and user-friendliness.
Performing a technical SEO audit regularly is essential as Google’s algorithm is always changing. That means that you’ll want to implement minor or major changes to your SEO strategy depending on the severity of algorithm updates. Website health issues can also pop up over time as you make changes to existing pages or add new ones to the site navigation.
How Often Should You Perform a Technical SEO Audit?
The frequency of your technical SEO audits depends largely on the size of your client’s website. Smaller websites with a more condensed blog can get by with a comprehensive audit once every six months. As the blog grows and the website expands, you may want to consider completing tasks on your technical SEO audit checklist once every three months.
The reason for this is that the larger the client’s website is, the more possibilities there are to encounter new SEO-related errors that can impact performance. The more errors you discover through your audit, the more time it takes to properly address each one.
How to Perform a Technical SEO Audit
With the purpose of a technical SEO audit fresh in your mind, let’s take a look at what we specifically need to accomplish to perform a successful audit of your client’s blog.
1. Crawl the Entire Website
Though your main focus is the blog, it will benefit you to crawl the entire website at the same time.
In case you’re unfamiliar, web crawling is the process of systematically navigating or “crawling” a website, downloading its pages, and extracting their content so it can be indexed by search engines. Software used to crawl a website is generally referred to as a web crawler or a spider.
Crawling a website will not only assist you with the above actions but will also inform you of notable errors found on the website. It may also highlight areas for optimization that can further improve the site’s user experience and discoverability. This is the primary reason that every technical SEO audit should begin with a fresh crawl.
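To make the idea concrete, here is a minimal sketch of the link-extraction step at the heart of any crawler, using only Python's standard library; the HTML fragment and URLs are hypothetical examples, and a real crawler would fetch pages over HTTP and record each one's status code:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkCollector(HTMLParser):
    """Collects the href target of every anchor tag on a page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative paths against the page's own URL
                    self.links.append(urljoin(self.base_url, value))


# Hypothetical page fragment; a real crawler would download this over HTTP
page = '<a href="/blog/post-1">Post</a><a href="https://example.com/about">About</a>'
collector = LinkCollector("https://example.com/")
collector.feed(page)
print(collector.links)
# → ['https://example.com/blog/post-1', 'https://example.com/about']
```

Dedicated crawling tools automate exactly this loop across every discovered URL, which is why they can surface broken links and orphaned pages that are easy to miss by hand.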
However, you should be aware that Google allots each domain a limited crawl budget: the number of pages Googlebot is willing to crawl on the site within a given period. Googlebot will crawl your website naturally over time, but additional crawl requests draw on that budget, and the more pages you ask it to crawl, the more of the budget you use.
With a fresh diagnosis of the website’s health on hand, you now have a list of actionable errors you can present to the client and the team for future SEO recommendations.
2. Set Up a Website Mirror for Testing Changes
Though setting up a mirror of the client’s website is not a necessity, it can be useful for a couple of reasons. A mirror site is an identical replica of the site, typically hosted on a separate subdomain or staging server rather than on the primary domain. First, because the mirror is not the version visitors reach, you are free to test your SEO optimizations without fear of breaking the live site.
Secondly, it allows you to experiment with changes and optimizations without taking the primary website offline for an extended period.
However, if you choose to use a mirror site, you need to be wary of how it can affect the site’s SEO. If Google happens to crawl two identical versions of the same site, the duplicate content can dilute the site’s rankings and harm its authority.
You can combat this in two ways. The first is to always use the appropriate canonical tags on the client’s website. A canonical tag designates the primary version of a page so that Google’s crawler knows which of the two copies to index. Keep in mind that a canonical tag is a hint about indexing rather than a barrier to crawling, so on its own it won’t stop the crawler from spending the client’s crawl budget on the mirror.
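For instance, assuming the live site serves a post at a hypothetical URL, the canonical tag placed in the page's head would look like:

```html
<!-- Both the live page and its mirrored copy point to the live URL -->
<link rel="canonical" href="https://example.com/blog/my-post/" />
```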
The alternative is to disallow crawling of any pages you’re working on in the site’s robots.txt file. The robots.txt file tells crawlers which pages to crawl and which to avoid according to the parameters you set. Then, after you’ve tested and implemented your changes, you can allow crawling of those pages once more.
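A robots.txt rule blocking crawlers from in-progress sections might look like this (the paths are hypothetical):

```
# Applies to all crawlers
User-agent: *
# Keep the mirror and any draft pages out of the crawl
Disallow: /staging/
Disallow: /blog/draft-redesign/
```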
3. Review the XML Sitemap
An XML Sitemap is a file that informs web crawlers of the most important pages found on a client’s website. Not only can you highlight specific pages, but you’ll also be helping the crawler understand the site’s overall structure. This is especially important as you add more blog pages and increase the website size.
According to Google, a sitemap is limited to 50MB in size and must not contain more than 50,000 URLs. While these guidelines likely won’t impact you or any of your clients, it’s important to keep them in mind when planning your map.
You want to make sure that the sitemap is constantly updated with new website pages. Each time you make additions to the client’s blog, you’ll want to incorporate those URLs into the map, especially if no other internal links are pointing to those pages.
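As a point of reference, a minimal XML sitemap entry for a newly published post (hypothetical URL and date) follows the sitemaps.org format:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/new-post/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
</urlset>
```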
To get an inventory of errors found within the client’s sitemap, you’ll want to use Google Search Console. This tool is a godsend for your SEO and will assist with crawl requests, site health reports, and metrics regarding site performance.
You’re going to be looking for common errors such as unexpected 404 responses, 401 unauthorized errors, or pages being blocked by robots.txt. If your sitemap contains too many errors, the crawler may abandon the crawl altogether.
4. Perform a Link-Building Audit
This step will go hand-in-hand with your sitemap review. Google Search Console will also report any broken links found on any of the client’s web pages. This applies to both internal links throughout the site and external links pointing to other domains.
Broken links that lead to 404s not only negatively impact the site’s SEO but can also waste crawl budget and cause the search crawler to abandon the crawl. These errors are indicators that you need to set up appropriate redirects or remove the link. If you have any outdated links pointing to external sources, update those links with the correct URL or find an alternate source for your information.
Finally, you’ll want to do a separate audit of your collected backlinks. A backlink refers to any links located on other domains that point back to your domain. You can perform a backlink audit by using tools such as SEMRush and Ahrefs. If you have any backlinks from low-quality domains, you’ll need to reach out to the webmaster and request the removal of the link.
Alternatively, you can submit a disavow request directly to Google using the disavow links tool in Google Search Console.
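If you go the disavow route, Google expects a plain-text file with one entry per line; the domains and URL below are hypothetical examples:

```
# Lines starting with "#" are comments
# Disavow every backlink from an entire domain
domain:spammy-directory.example
# Disavow a single linking page
https://low-quality-site.example/links/page.html
```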
5. Monitor Page Response Codes
Monitoring page status codes is another step that overlaps with the previous ones. As you scanned for errors in your sitemap and in Google Search Console, you were likely made aware of page response errors found on the website.
A page response error occurs when a user’s browser sends a request to the server but the server cannot fulfill it properly. What the error means depends on the status code returned.
The most common page response error you’re likely to encounter is a 404, page not found. This occurs when a user attempts to access a page that no longer exists, or follows a link with a mistyped URL. To remedy these, remove deleted pages from the sitemap and update any broken links found on existing pages.
You can also alleviate page response errors by making effective use of 301 and 302 redirects. A 301 permanently redirects a user from a dead URL to an updated one, while a 302 is used for temporary page moves.
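As one common implementation, assuming the client's site runs on Apache with mod_alias enabled, both kinds of redirect can be declared in the .htaccess file; the paths here are hypothetical:

```apache
# Permanent move: the old blog URL now lives at a new path
Redirect 301 /blog/old-post/ https://example.com/blog/new-post/
# Temporary move: the original URL returns after the campaign ends
Redirect 302 /blog/sale/ https://example.com/blog/holiday-sale/
```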
Understanding page response errors allows you to understand how to remedy them. Thankfully, these types of errors are relatively simple to fix and can make a difference in improving your client’s website health.
6. Check for Page Load Speed Errors
Fast page loading speeds are crucial for two reasons. First, a slow loading speed is known to drastically increase your bounce rate. According to Pingdom, the average load time for a page is 3.21 seconds, and once load time reaches 5 seconds, the bounce rate is a staggering 38%.
Secondly, that increased bounce rate and poor UX will also count against the site as a ranking factor. Google values page speed in providing a better experience to its users and will consider loading times when ranking the site against competitors.
Because of this, Google provides us with its PageSpeed Insights tool to check up on our latest website speeds for both desktop and mobile. Use this tool when conducting your technical SEO audit to look for website health errors that are negatively impacting your loading speeds.
7. Don’t Forget About the Mobile User Experience
More users than ever are utilizing their phones for their browsing experiences. Google notices this and agrees that the mobile user experience is essential for a quality website. That’s why they rolled out an update back in 2015 that would boost the ranking of mobile-friendly pages.
For this reason, you’ll want to scrutinize the mobile design of the website in addition to the desktop version. Tools like Google Search Console should identify key areas for improvement, but you can also turn to Google’s Mobile-Friendly Test to learn more about the website.
Numerous factors can negatively impact the website’s mobile UX. Not only will these tests pinpoint the problems, but they will also provide a recommended course of action. Common culprits include unoptimized images, the lack of a mobile-first layout, and scripts that aren’t tuned for mobile hardware.
Most modern CMS platforms provide built-in features that allow you to tailor the website design to both desktop and mobile experiences simultaneously. Be diligent in examining all versions of the website before committing to any changes that might impair your UX on other versions.
8. Check Daily Site Activity Metrics in Google Analytics
Google Analytics assists website owners in tracking daily statistics and creating customized reports regarding user behavior. If your client does not already have their site tagged by Google Analytics, this will be an excellent place for you to start.
As you begin to gather user data, you’ll receive valuable metrics for your audit including daily visits, bounce rate, and average session duration. You’ll even be able to visualize what channels people are using to access the domain.
These metrics can help you better explain the quality of the website to your client. If a particular blog article is getting clicks but also shows a high bounce rate and low session duration, it’s a red flag that the content or layout is not up to snuff. So long as those engagement metrics remain poor, the client’s rankings will continue to suffer despite any other technical SEO changes you implement during the same period.
You’ll want to gather this GA data and present it in the context of an actionable plan. If you see that certain types of blog articles are generating more traction, you’ll want to illustrate the importance of gearing the blog toward what the audience likes. You can also look for opportunities to start implementing an effective link-building strategy within the blog to help more users move from article to article.
9. Review Structured Data
Structured data is a standardized format that helps Google better understand the type of content found on a web page. The most common type of structured data used by SEO experts is schema markup, a vocabulary of tags implemented within the website’s HTML to qualify sections of content.
Schema markup can give a huge boost to your SEO as it takes the guesswork out of Googlebot’s job. Without the markup language, the crawler is left to examine the on-page content and do its best to determine what the content is and how it relates to other content on the site. When using the markup, the crawler immediately reads the tag, understands the content, and indexes it correctly.
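For example, a blog article can be described with JSON-LD, the format Google recommends for schema markup; every value below is a placeholder:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "A Hypothetical Blog Post Title",
  "datePublished": "2023-01-15",
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  }
}
</script>
```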
However, when used incorrectly, schema markup can further confuse search engine crawlers. In your audit, you’ll want to scan for errors in the markup language and create a plan to either fix the page and/or expand upon the existing markup code where applicable.
Presenting Your Technical SEO Audit Findings
If you follow this technical SEO audit checklist, you’ll have a significant amount of information regarding the client’s website at your disposal. Your job now will be to present these findings in the context of improving the site and present an actionable plan. You’ll want to highlight the benefits of executing the plan along with a timeline to expect results.
After your SEO experts tackle the client’s website and blog, you should execute one more website crawl. Ideally, you will now have a significantly improved website report to share with your client. These positive changes may produce some instant results, but will also help to boost blog rankings over time.
However, be aware of additional errors that might pop up following any major website changes. This is not uncommon and the client’s site might require some additional care even after the major changes go live.
SEO is an ongoing process and your website will require ongoing attention as search algorithms continue to evolve. That’s why using a checklist like this to guide your technical SEO audits on a regular schedule is the best way to improve a site’s organic traffic and rankings. The healthier your client’s website is, the more conversions they’ll obtain, and the happier they will be.