An SEO audit sets the stage for your SEO efforts. It reveals quick wins that drive results and a path to success the organization can rally behind.
Yet many SEOs still struggle to audit a site in a way that delivers immediate results or instills long-term confidence in the SEO program.
To help, I created a simple, easy-to-follow SEO audit checklist in the form of a Google Sheets template. In it, I listed every audit check you should perform to drive search performance forward. Simply click the button below and follow this guide to get started on your website SEO audit.
A couple of notes before we get started:
Although not strictly required, to really dive in and conduct a comprehensive audit you’ll want an SEO site crawler.
There are plenty of site audit and crawler tools on the market, and in this post I’ll be demonstrating with one of the best: seoClarity’s built-in site crawler technology.
It’s been battle tested on a site with more than 48 million pages, and gives you full access to find SEO issues that plague your site with no artificial limits. That’s no limits on crawl depth, speed, pages crawled …
As you run your crawl through the audit process (and follow along with our checklist!), you’ll gather a list of technical issues. Give each step a grade of pass, fail, or needs improvement.
At each step, you’re assessing how well the site performs in that audit area. A natural prioritized list of issues will then emerge to help you work toward improving your organic traffic.
But there are other outcomes, too: SEO roadmaps, projects, team member roles, lunch-and-learn summits, goals, execution plans, new tests, and ultimately successes all come from auditing these areas.
In the end you’ll prioritize the fail issues first, followed by the most promising needs improvement issues based on your time and resources.
Let's get started! Open the SEO audit template and follow along for this thorough checklist that's sure to drive better search visibility and ROI for your business.
A search engine bot views the robots.txt file before crawling a site. It gives directions on how to crawl (or not crawl) the website, which makes it a good first step in the audit.
For one, it contains instructions about folders or pages to omit as well as other critical instructions. As a good practice, it should also link to the XML sitemap so the bot can find a list of the most important URLs.
You can view the file manually by going to mydomain.com/robots.txt (replace “mydomain” with your site’s URL, of course). Look for commands that might be limiting or even preventing site crawling.
If you have access to an SEO tool to crawl the site, let it loose on the site and be sure to set the user agent to follow instructions given to Googlebot.
This way, if you’re blocking Googlebot via the robots.txt file, the crawl data will flag the URL as blocked by robots.txt instead of returning a “200 OK” status code and the information for the URL.
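If you’d rather script a quick check than run a full crawl, Python’s standard library can parse the live robots.txt file and report whether Googlebot may fetch a given URL. A minimal sketch (the domain and paths are placeholders):

```python
# Minimal robots.txt check using only the standard library.
# The domain and paths below are placeholders; swap in your own.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt file

key_urls = [
    "https://www.example.com/",
    "https://www.example.com/category/widgets",
    "https://www.example.com/blog/some-post",
]

for url in key_urls:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{'OK' if allowed else 'BLOCKED'}  {url}")
```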
Google Search Console also reports URLs where Googlebot is being blocked. seoClarity users can find this in the advanced settings in a Clarity Audits crawl.
Auditing the robots.txt file sometimes offers the lowest of all low-hanging fruit in SEO: the code often found on development sites that block Google from crawling the entire domain (pictured below).
Recommended Reading: 14 Common Issues with the Robots.txt File in SEO (and How to Avoid Them)
This code is sometimes left in place after the site goes live (or carried over from a site in development), which will continue to prevent it from performing in SEO, so it’s quite the find!
Other mistakes involve overly broad patterns. This rule, for instance, is correct:

Disallow:/something

It disallows every URL that sits under the root path www.example.com/something.

The mistake would be an overly broad wildcard pattern:

Disallow:*something

This rule would disallow every URL containing ‘something’, including URLs you don’t mean to block, e.g. www.example.com/stuff/a-big-something.
A sitemap lists the pages on the site. Search engines use it to find URLs directly, without having to rely on the site’s internal link structure.
Recommended Reading: How to Create an XML Sitemap
Your sitemap should reside in the root folder on the server. The most common place to find it is mydomain.com/sitemap.xml, or linked from the robots.txt file. Otherwise, the content management system (CMS) may show the URL if there is one.
Crawl the sitemap URLs to make sure they are free of errors, redirects, and non-canonical URLs (e.g. URLs that have canonical tags pointing to another URL). Submit your XML sitemaps in Google Search Console and investigate any URLs that are not indexed. They’ll likely have an error, redirect, or non-canonical URL!
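If you want to script this check, here’s a rough sketch that fetches a sitemap and flags any URL that doesn’t return a clean “200 OK.” It assumes the requests library is installed, handles a plain sitemap (not a sitemap index), and uses a placeholder sitemap location:

```python
# Flag sitemap URLs that return errors or redirects.
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=30).content)
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]

for url in urls:
    # Some servers reject HEAD requests; switch to requests.get if needed.
    r = requests.head(url, allow_redirects=False, timeout=30)
    if r.status_code != 200:
        print(f"{r.status_code}  {url}")  # errors and redirects to investigate
```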
SSL encryption establishes a secure connection between the browser and the server. Google Chrome marks secure sites (those having an active SSL certificate) with a padlock image in the address bar.
Recommended Reading: HTTP vs HTTPS: What’s The Difference and Why Should You Care?
It also warns users when they try to access an insecure site.
Most importantly, though, Google also uses the HTTPS encryption as a ranking signal.
Visit the site in Chrome and look at the address bar. Look for the padlock icon to determine whether or not your site uses an SSL connection. You can also test your SSL encryption at ssllabs.com/ssltest/ to ensure it is valid.
seoClarity users can use our built-in crawler to run a crawl and leverage the Parent Page report to find all instances of old internally linked HTTP URLs at scale so they can be updated to the new HTTPS versions.
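A quick way to script part of this check is to confirm that the HTTP version of key URLs redirects to HTTPS. A sketch, assuming requests is installed (the URLs are placeholders):

```python
# Confirm http:// URLs redirect to their https:// versions.
import requests

for url in ["http://www.example.com/", "http://www.example.com/category/widgets"]:
    r = requests.get(url, allow_redirects=True, timeout=30)
    if r.url.startswith("https://"):
        print(f"OK      {url} -> {r.url}")
    else:
        print(f"NO SSL  {url} stayed on {r.url}")
```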
More than half of web searches come from mobile devices. You should really think of your website as mobile-first: a site that also happens to work on desktop if someone tries. At this stage of the audit, you’re checking that the basic mobile-friendly aspects are in place.
Recommended Reading: Mobile SEO Optimization: 6 Factors That Help Improve Mobile Search Visibility
Select your most important templates, for example a category page, product page, and blog post. Test them with the Google Mobile-Friendly Test. Prioritize issues reported for the development team to fix.
Note: Google announced mobile-first indexing of the entire web on their Webmaster Blog in March 2020.
There are many ways to go wrong in mobile SEO. The most common issues arising today come from limiting the mobile experience compared to desktop. Give mobile users a full experience, not just the parts of the desktop site that work OK on mobile.
Other mobile design aspects that come up are small or crowded tap targets, fonts that are too small to read, content wider than the screen, and intrusive interstitials.
Page speed is an SEO’s best friend. It’s one of the most critical factors that affect a site’s visibility in Google — and this conversation has only grown more important in recent years with Google’s announcement of the Core Web Vitals update and page experience.
In the update, the Core Web Vitals metrics (which relate closely to page speed) combine with other experience metrics to create the page experience signal.
Page speed also has a direct correlation with bounce rate and conversions! It’s often first among the “it’s boring but it really works!” takeaways from SEO practitioner discussions. As a result, optimizing page speed and decreasing load time often delivers instant results to a company’s organic presence and sets the tone for improving the search experience.
Recommended Reading: Page Speed and SEO: How to Improve User Experience and Rankings
Use Google’s PageSpeed Insights tool to evaluate key templates on the site. The same data appears in the Google Lighthouse reports found at web.dev.
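You can also pull the same Lighthouse data programmatically from the public PageSpeed Insights v5 API, which is handy for checking several templates at once. A minimal sketch, assuming requests is installed (the URL is a placeholder, and heavy use may require an API key):

```python
# Query the PageSpeed Insights v5 API for a page's mobile metrics.
import requests

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}

data = requests.get(PSI, params=params, timeout=120).json()
lighthouse = data["lighthouseResult"]
score = lighthouse["categories"]["performance"]["score"]
print(f"Performance score: {score * 100:.0f}/100")

for metric in ("largest-contentful-paint", "cumulative-layout-shift", "total-blocking-time"):
    print(f"{metric}: {lighthouse['audits'][metric]['displayValue']}")
```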
seoClarity’s Page Speed Analysis gives a handy point of view by combining all these issues across the site to prioritize the impact. It also allows you to keep track of page speed on a weekly or monthly basis, making it easier to monitor and evaluate your progress.
A few important <head> section tags help Google index the site properly. These exalted tags include the title tag, meta description, canonical tag, and hreflang attributes.
Without these tags, Google is forced to make assumptions about where to pull content from to create the listing (title and description), which page among duplicates should be shown to users (canonical tag), and whom to show it to (hreflang).
Recommended Reading: How to Write the Perfect SEO Meta Description
Install the Chrome plugin Spark, or manually inspect the code on key landing pages via Inspect in Chrome to spot these tags. Assess whether key SEO tags are present in the <head> section. The Spark plugin will display the data if it’s properly coded.
Also, an seoClarity crawl will collect these issues as shown below. This report will show any potential issues with these tags (e.g. duplication, values that are too long, or are missing).
Utilize and configure each tag properly for every page on the site. At this stage, it’s important that the tags are present and valid on the site.
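For spot checks, a short script can also confirm the tags are present. A rough sketch, assuming requests and beautifulsoup4 are installed (the URL is a placeholder):

```python
# Report whether the key SEO tags exist in a page's <head>.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/category/widgets"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")

title = soup.title
description = soup.find("meta", attrs={"name": "description"})
canonical = soup.find("link", rel="canonical")
hreflang = soup.find_all("link", rel="alternate", hreflang=True)

print("title:      ", title.get_text(strip=True) if title else "MISSING")
print("description:", description.get("content", "")[:80] if description else "MISSING")
print("canonical:  ", canonical.get("href") if canonical else "MISSING")
print("hreflang:   ", f"{len(hreflang)} entries")
```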
For the search engines to index and rank a site, they need to crawl its pages first. Google, for example, releases a bot to crawl a site by executing internal links.
Errors, broken pages, overuse of JavaScript, or a complex site architecture might derail the bot from accessing critical content or use up the available crawl budget trying to figure out your site.
Recommended Reading: A Guide to Crawling Enterprise Sites Successfully
Use an SEO crawler to imitate the path taken by the search engine’s bot — an advanced crawler will be able to replicate Googlebot and see your website as the search engine sees it.
Look for reports of crawl issues caused by unnecessary URLs, broken links, redirect chains or incorrect canonical configurations. The right crawler will be fully customizable and allow you to set the crawl depth, speed and frequency. We’re proud to say that our built-in crawler allows for all of that, all with no limitations!
Google Search Console also surfaces crawling errors it has found.
Recommended Reading: SEO Crawlability Issues and How to Find Them
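To get a feel for what a crawler does under the hood, here’s a toy breadth-first crawl sketch that surfaces broken pages and redirects. It assumes requests and beautifulsoup4, stays on one hostname, and caps itself at 500 URLs; a dedicated crawler is the right tool for anything bigger (the start URL is a placeholder):

```python
# Toy breadth-first crawler: prints broken pages and redirects.
from collections import deque
from urllib.parse import urljoin, urldefrag, urlparse

import requests
from bs4 import BeautifulSoup

start = "https://www.example.com/"  # placeholder
host = urlparse(start).netloc
seen, queue = {start}, deque([start])

while queue and len(seen) < 500:  # small safety cap
    url = queue.popleft()
    r = requests.get(url, allow_redirects=False, timeout=30)
    if r.status_code >= 400:
        print(f"{r.status_code}  {url}")  # broken page
        continue
    if 300 <= r.status_code < 400:
        # Redirect targets are not followed in this sketch.
        print(f"{r.status_code}  {url} -> {r.headers.get('Location')}")
        continue
    for a in BeautifulSoup(r.text, "html.parser").find_all("a", href=True):
        link = urldefrag(urljoin(url, a["href"]))[0]
        if urlparse(link).netloc == host and link not in seen:
            seen.add(link)
            queue.append(link)
```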
Rendering relates to the way Google sees and displays your page as a document - both the content and the code. You’ll stay safe here by sticking to progressive enhancement principles, where the content and core functionality of the site are accessible even from a text-only browser.
Then, as the browser executes the CSS and JavaScript files, they enhance an experience whose content is already available to Google when it renders, or views, the page.
For many years Google did not execute JavaScript, but now that it does, be sure Google can access all the files needed to create the page just like any other user (i.e. remove any blocking of these files).
Recommended Reading: AngularJS & SEO: How to Optimize AngularJS for Crawling and Indexing
Run a site crawl with JavaScript enabled to render pages exactly as they would appear in the browser. In doing so, you’ll evaluate issues Google might encounter when rendering your pages with their JavaScript crawling capabilities.
Another way to check how well your page renders to Google is to view the cached version of a few important page templates. You can view this after searching for the page and clicking the option next to the URL to view the cached version.
Google tends to store only the HTML of websites in the cache, as opposed to the JavaScript-executed version. If properly executed, the page should still render all important content and SEO elements in that HTML state.
Additionally, the Mobile-Friendly Test tool is known to behave like Google’s headless browser, executing the page’s JavaScript and rendering the page. This makes it a great way to test whether anything is stopping Google from accessing the content.
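Another scripted check is to compare the raw HTML with the JavaScript-rendered DOM, which hints at how much of your content depends on rendering. A sketch, assuming requests and playwright are installed and `playwright install chromium` has been run (the URL is a placeholder):

```python
# Compare raw HTML size to the JavaScript-rendered DOM size.
import requests
from playwright.sync_api import sync_playwright

url = "https://www.example.com/category/widgets"  # placeholder
raw_html = requests.get(url, timeout=30).text

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(url, wait_until="networkidle")
    rendered_html = page.content()
    browser.close()

print(f"raw HTML:     {len(raw_html):,} bytes")
print(f"rendered DOM: {len(rendered_html):,} bytes")
# A large gap suggests important content only appears after JavaScript runs.
```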
The index is where Google stores information about pages it has crawled. It’s also where it selects the content to rank for a particular search query. Google is rapidly expanding, culling, and updating its index of the Internet.
Recommended Reading: 3 Common Search Engine Indexing Problems
A quick search in Google for “site:domain.com” will show indexed pages. You can dig into these and likely find duplicate content and other “over-indexed” pages too.
If you skip to the last page of results, there is typically a message noting that Google has omitted similar results. Click it to reveal the pages Google found but consolidated because it considers them duplicates.
At this point in the audit you know you’re free of “crawl issues.” So to move on from here you want to make sure Google has found your content and has it indexed, ready to show it to searchers.
Indexing is the outcome of crawling and rendering your site properly, with the proper tags and content unique and valuable enough that Google thinks it’s worthwhile. You’re indexed by Google if you can find your URL or site by searching for it in Google.
To get a bigger picture, Search Console provides indexation levels. They recently added an Excluded section as well. Some pages on the internet aren’t worth indexing, and this is where Google will share where that’s the case (so you can work on making it better!).
In seoClarity’s Site Health report, you’ll find the indexability data showing:
Finally, you can review bot activity on the server to see how Googlebot is crawling your site, which may explain how your site is being indexed. In seoClarity, you can filter the results by response codes to identify page errors instantly.
You may find that some pages are not getting crawled simply because nothing links to them. Improve internal links continuously to ensure the bot can reach pages deep in the site’s architecture as well. For non-indexed pages, check whether the content is too thin to warrant indexation.
Indexation problems typically indicate an issue with crawling and rendering. If the site is still under-indexed, then its meta tags and/or content may need to be updated to improve relevancy and demonstrate to Google that it’s worthwhile to index.
The most common mistake, though, is using the meta robots “noindex” tag inadvertently. We’ve seen more than one example of this tag being mistakenly added to a prized landing page after a development release!
Google then does what it’s told and removes the URL, while the SEO team submits an urgent ticket to get the noindex tag removed and the page re-indexed.
seoClarity users leverage Page Clarity, which checks the URL daily and sends an email alert if this tag is found (hopefully before Googlebot).
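If you want a scrappy version of that safety net, a scheduled script can watch prized pages for an accidental noindex in either the meta robots tag or the X-Robots-Tag header. A sketch, assuming requests and beautifulsoup4 (the URLs are placeholders):

```python
# Alert when a noindex directive appears on a key landing page.
import requests
from bs4 import BeautifulSoup

for url in ["https://www.example.com/", "https://www.example.com/top-landing-page"]:
    r = requests.get(url, timeout=30)
    header = r.headers.get("X-Robots-Tag", "")
    meta = BeautifulSoup(r.text, "html.parser").find("meta", attrs={"name": "robots"})
    meta_content = meta.get("content", "") if meta else ""
    if "noindex" in header.lower() or "noindex" in meta_content.lower():
        print(f"ALERT: noindex found on {url}")
```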
Faceted navigation (discussed in step 13) relates to this step as well. Faceted navigation creates an exponential amount of pages for Google to index. If these are not created thoughtfully, several duplicate URLs can be generated — which causes your site to be over-indexed and dilutes the impact of your most important URLs!
This is where the SEO analysis switches hats slightly from technical-minded to content-minded. Your site is in the game, now let’s think about how it’s being played.
In particular, how well is it optimized for relevant keywords? Key areas to audit are how the meta tags, header tags, and body copy are being used to create a great search experience for the target keyword topics.
Before evaluating on-page SEO, conduct thorough keyword research so that you know what phrases various content assets target.
Recommended Reading: 6 Steps to an In-Depth Content Audit That Will Ensure a Traffic Boost
You can do it in a couple of ways, for example by searching the crawled page content for the target phrases. And you can use this capability to do so much more.
Let’s say that you want to find all pages with video on them to audit their on-page optimization. Simply look for instances of words such as “video,” “YouTube” or “short clip” to access every page featuring a video.
Recommended Reading: Finding Additional Content: Narrow in on Specific Site Features
Title Tags - approximately 70 characters, use target keyword. Learn how to run a title tag test.
Meta Descriptions - approximately 150 characters. Wow the searcher into clicking through to the site; assure them you have their answer. Find more on writing a meta description. I recommend owning the book “Words that Sell” to give you an idea of how to make your listings stand out in search results.
Headers - Specifically, the “H1” tag should typically be a shortened version of the title tag, between 2-3 words. H2 tags should be used if they follow the format of the page. H3s and beyond are of less importance, but if used they should be relevant and appear in order.
Body Content - Created to improve the search experience. Find the target keywords for the page and write to them with authority, helping solve the searcher’s problems. seoClarity users can leverage Research Grid for this by finding the keywords where the URLs ranking for the target keyword are also ranking.
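A quick script can apply the length heuristics above during spot checks. A sketch, assuming requests and beautifulsoup4 (the URL is a placeholder):

```python
# Flag titles, descriptions, and H1s that break the rough heuristics above.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/category/widgets"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")

title = soup.title.get_text(strip=True) if soup.title else ""
meta = soup.find("meta", attrs={"name": "description"})
description = meta.get("content", "") if meta else ""
h1s = [h.get_text(strip=True) for h in soup.find_all("h1")]

if len(title) > 70:
    print(f"Title is {len(title)} chars (aim for ~70): {title}")
if not description or len(description) > 150:
    print(f"Description is {len(description)} chars (aim for ~150)")
if len(h1s) != 1:
    print(f"Expected exactly one H1, found {len(h1s)}")
```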
At this stage of the audit, the goal is to spot a few quick SEO tweaks on pages whose keywords are ranking in positions 3-8. By doing this you can gain wins and establish an ongoing workflow to improve these elements for the search experience.
There are a few common issues with these elements.
Consider the information gaps of the person coming in from outside the site. Do you offer all the information and context needed to choose the best available product or service for their needs, beyond stating the keyword?
Do you help them learn about the product or service and make a well-informed buying decision? Can they see what is unique about your offering or information? This is relevant content.
Recommended Reading: How to Create Relevant Content to Captivate Your Target Audience
Run a Google search for your target keyword and review the top ranking sites. Why does Google think they’re special enough to rank in the top positions?
Some sites may give extra content (i.e. information that is outside the target keywords but helpful to the search experience). Or maybe they use videos, images, and other engaging elements that create a positive search experience.
Providing relevance is all about considering the contextual aspects of the user experience.
Websites that do a better job of meeting the needs of searchers have a better chance of landing on the first page of the search results. Your job as an SEO is to determine what’s causing this behavior and then figure out ways to provide a better result for the visitor overall.
To get this right you must consider the user’s search intent. For example, if you’re selling a grill, stating how many burgers it holds might matter more to the user than the surface area in inches. That detail connects better to their intent of buying a grill that will feed their family.
seoClarity users can leverage our massive keyword data set (30+ billion keywords!) to understand exactly how their audience searches. Use Topic Explorer to match key topics based on underlying intent and uncover what users are looking for.
Plus, get an inside look at the latest search trends so you can create the most relevant content for your audience long before the competition even knows that a query is in demand.
Create a great search experience for your target audience. After all, creating a positive search experience for your users should be a main priority. We’ve put together a complete framework for this that we like to call search experience optimization.
The best search experience is unique and true to your brand and includes well-written content and a sound site structure.
Structured data, defined at schema.org, gives webmasters and SEOs the ability to add semantic context to elements of the code.
Google, in turn, uses this information to enrich search listings. It’s how Google can show a company’s telephone number, reviews and star ratings, event info and much more on results pages.
This helps attract the user’s attention and boost the organic click-through rate.
However, for schema.org to work, it must be properly executed in the code. As part of the audit process, I recommend reviewing the markup for potential issues and errors, and making a plan for the ideal schema to use for the most important templates.
Recommended Reading: Technical SEO Best Practices: Schema [WEBINAR]
Use Google’s Rich Results Test to evaluate your schema markup and its eligibility to appear as a rich snippet on the SERP. Just grab a few important pages and enter them in.
Google Search Console also reports on potential issues with schema. You’ll find the report under the Enhancements section (it shows if you have markup added to the site), listing errors, warnings, and the total number of valid URLs.
While Google’s tools can get the job done, you’d have to go page by page — which just isn’t feasible for an enterprise site! You can use an SEO platform to audit schema at scale.
We’ve actually written an article just about this. More on that here: Auditing Schema Markup: Confirming Structured Data's Implementation.
seoClarity’s Schema builder is a Chrome plugin that makes it super easy to apply structured data to your site. You can try it now for free below!
Not including all of the essential data is a common mistake with schema. Audit tools will flag issues as “required” or “recommended” to help you prioritize the fixes.
Sometimes developers will omit information from the markup or let it get truncated, such as an author’s first name being cut off or an incomplete product name. After checking a URL with Google’s testing tool, take a moment to read through the values to make sure everything is complete.
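Reading through the values is easier when you pull the markup out of the page. Here’s a sketch that extracts JSON-LD blocks so you can skim them for truncated or missing fields; it assumes requests and beautifulsoup4, and the URL is a placeholder:

```python
# Extract and pretty-print a page's JSON-LD structured data.
import json

import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/product/widget"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")

for script in soup.find_all("script", type="application/ld+json"):
    try:
        data = json.loads(script.string or "")
    except json.JSONDecodeError as e:
        print(f"Invalid JSON-LD on {url}: {e}")
        continue
    print(json.dumps(data, indent=2)[:500])  # skim the values for completeness
```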
Don’t abuse structured data markup. Google is much more aware of manipulation of Structured Data these days and will happily apply a manual action if they feel you are spamming them.
The many ways you can trigger a manual action from Google include marking up content that’s invisible to users, marking up irrelevant or misleading content, and adding review markup that users never actually wrote.
Recommended Reading: 7 Common Issues with Implementing Structured Data
Faceted navigation helps e-commerce sites expand their reach by strategically creating sub-categories at scale. For example, if you’re selling Chicago Bulls hats, you may have red, black, and white ones.
Most people search “Chicago Bulls Hats” and you have that search experience covered with your category page. With smartly applied faceted navigation you can allow your site to create a page for “White Chicago Bulls Hats,” “Black Chicago Bulls Hats,” and “Red Chicago Bulls Hats.”
You likely already allow a user to filter products like this. Faceted navigation is what allows Google to index those filtered pages so the searcher that knows the color they want can skip that step in their conversion funnel and land directly on your Red Chicago Bulls Hats page.
However, faceted navigation can generate problems, particularly if each filter creates a new URL on the site without matching a demand. Google might pick those up, leading to the search engine crawling unnecessary URLs and indexing duplicate content.
Filtering is very useful to users; it’s obviously helpful to be able to narrow down to the specific item you want. You just don’t want useless filtered pages indexed by Google.
Recommended Reading: Faceted Navigation SEO: Optimize for the Long-Tail Experience
Search different filters and product categories in Google to see if your pages show up in the index. Check whether you’re creating too many pages by finding the URL pattern your site uses to create them, e.g. searching “site:wayfair.com inurl:color=”
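If you have a crawl export handy, grouping URLs by query parameter shows which facets generate the most pages. A sketch using only the standard library; it assumes you’ve saved crawled URLs one per line to a file (the filename is a placeholder):

```python
# Count how many crawled URLs each query parameter (facet) appears in.
from collections import Counter
from urllib.parse import parse_qs, urlparse

param_counts = Counter()
with open("crawled_urls.txt") as f:  # placeholder export, one URL per line
    for line in f:
        for param in parse_qs(urlparse(line.strip()).query):
            param_counts[param] += 1

for param, count in param_counts.most_common(20):
    print(f"{count:>7}  ?{param}=")  # the facets creating the most URLs
```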
This may be the single thing separating the top sites from the pack. The top sites utilize faceted navigation brilliantly. Only pages that align with search volume are offered to be indexed and all of the key On-Page elements mentioned above update to the long-tail target.
Most important, the on-page elements must update so the “faceted navigation URL” stands on its own: a unique URL, title tag, description tag, and H1 tag unique to that page, as if it were another top-level category page on your site.
Because every possible combination of facets typically generates (at least) one unique URL, faceted navigation can create a few problems for SEO: wasted crawl budget, duplicate content, and diluted importance across near-identical pages.
We would recommend the following solution: Category, subcategory, and sub-subcategory pages should remain discoverable and indexable.
For each category page, only allow versions with 1 facet selected to be indexed. On pages that have one or more facets selected, all facet links become “nofollow” links.
On pages that have two or more facets selected, a “noindex” tag is added as well in case Google does crawl these pages (perhaps from a direct external link).
Determine which facets could have an SEO benefit (for example, “color” and “brand”) and make sure Google can access them. Essentially, throw them back in the index for SEO purposes.
Ensure your canonical tags and meta robots tags are set up appropriately.
For example, a common mistake is having the canonical tag point to the unfaceted version (e.g. a primary category page), which would only be a valid setup if you didn’t want Google to index the faceted URL.
Another mistake is not creating a crawl path for Google to find these URLs, perhaps because the sort/filter functionality sits entirely behind JavaScript execution. Creating multiple versions of the same faceted URL is also a problem, for example “White Chicago Bulls Hats” and “White Hats - Chicago Bulls.” If your site can create the same filtered product list at more than one URL, be sure the canonical tag on each points to a single version for your SEO targeting.
Finally, not including these pages in an XML sitemap is a mistake because it misses a chance to tell Google you really mean for these URLs to be indexed and shown to searchers. A crawl of your site will reveal most issues as well, because it’ll reveal the URLs and whether the proper tags are in place to match the search demand.
A website should support people with physical, cognitive, or technological impairments. Google promotes these principles within their developer recommendations. For SEO, accessibility issues are a combination of the rendering and relevance issues laid out above. Google describes accessibility as meaning “that the site's content is available, and its functionality can be operated, by literally anyone.”
These elements help those users, but they’re also great for everyone; a distinct color contrast ratio, for example, is something all users appreciate.
The Google Lighthouse Plugin (web.dev) does a great job of outlining accessibility issues. It will flag items in the code such as missing alt text on images or names on clickable buttons. It will look for descriptive text on links (no “click here”).
These elements help bring the site to life for those using screen readers. It’s also not hard to see how a site built to these standards would help Google understand the web at scale.
Be on the lookout specifically for: generic or missing link names, missing alt attributes, and headers skipping order (e.g. an H2 without an H1).
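Those three checks are easy to script for a single page. A sketch, assuming requests and beautifulsoup4 (the URL is a placeholder):

```python
# Flag generic link text, images missing alt text, and heading-order skips.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")

for a in soup.find_all("a", href=True):
    if a.get_text(strip=True).lower() in ("", "click here", "here", "read more"):
        print(f"Generic link text: {a['href']}")

for img in soup.find_all("img"):
    if not img.get("alt"):
        print(f"Missing alt text: {img.get('src')}")

last = 0
for h in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"]):
    level = int(h.name[1])
    if level > last + 1:  # e.g. an H3 with no H2 before it
        print(f"Heading skips order: <{h.name}> appears before any h{level - 1}")
    last = level
```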
This is a great opportunity to teach the team about these issues and create an accessibility standard for the site that is great for the search experience for all users and highly encouraged by the gatekeeper of the web, Google.
If you use a crawler like seoClarity’s Clarity Audit, you’ll get notifications on those issues in the site crawl report as well.
Authoritative content is unique to a brand and showcases their expertise on a topic. This works to build trust with the target audience.
This is where you really give the target content a hard look - look for original facts and research. Great content is genuine and written with care. Remember that the reader arrived after doing a search. They have a problem to solve, something to learn, or a fear to squash. Does your content answer the call? Does it calm their fear?
This is also the step where you evaluate the topic clusters for SEO, surrounding your content with blog posts and resources. These assets can target new terms in their own right and give you a chance to showcase authority on a subject and interlink the content.
Evaluate how the site is targeting “awareness” keywords. To see these, filter Google Search Console queries to those containing words like “how” (seoClarity users can do it using Search Analytics). Evaluate rankings and performance, looking for low-hanging fruit (pages ranking between positions 11-40 that need only a little push to appear on page one).
This is also a good step to find content gaps. These are commonly areas where your competition ranks but you don’t — it’s time to create more authoritative content to fill those gaps!
A great way to find them is to find the keywords where three competitors are in a prominent position but you are not. These are likely easy wins for you if you simply showed up and created the content.
You should also do keyword research to determine the opportunities at different intent stages related to the target keyword. For example, if you’re selling running shoes, an article on “how to choose the best running shoes” and “tips for running a marathon” are topics that can expand your authority for the primary terms (running shoes), and help move searchers down the funnel, without leaving your site.
Off-page analysis is a look at everything happening off the website that is impacting SEO, i.e. external links. The quality and quantity of relevant websites sharing and linking to your content is a good sign that your content is worthwhile. It’s the original Google innovation and currency of the web.
After this review of backlinks, you’ll have some specific targets and a good understanding of what drives linking in your industry.
From here my favorite link building tactics are the skyscraper technique and content outreach methods.
After auditing your backlinks, you can develop a plan on how to incorporate these tactics to best your competitors.
The biggest mistake SEOs make with off-page tactics is doing outreach without researching the specific value they can bring to the person they’re contacting. Skipping the competitive research, which can reveal invaluable nuggets about what drives competitors’ off-page success, is another misfire.
Congratulations! You made it through a site audit. Doesn’t that feel great?
You are now an expert on how to tactically improve your website and perform an SEO audit. When performing this site audit in the future or across other sites, remember:
Over time, as you continue to address these areas on your site, you’ll see the “Needs Improvement” notes turn into your strengths as you see the SEO performance improve.
Not sure if you’ve remembered it all? Bookmark our convenient, free site audit checklist to guide you through each step of the process.