Google has made it incredibly easy for all of us to ask a question and have it immediately answered. In less than an eighth of a second, a query is made and the result is returned.
What may look easy to us, however, is enormously complex in the background. When we conduct a Google search, Google is not searching the web for an answer in the moment.
Rather, Google has already crawled the web and built a database that it looks up.
Google uses a web spider called Googlebot that crawls the web to discover new web pages, which it then adds to Google’s index. Essentially, crawling is the search engine’s process of discovering pages on the web.
Recommended Reading: SEO Crawlability Issues and How to Find Them
Google then takes the newly discovered web pages and tries to learn what is on each page.
After this process, known as indexing, Google stores this information in its index – a huge database storing the pages that it has crawled and learned about.
So when you Google something, Google returns all relevant pages from its index. Google’s ranking algorithm does its best to provide the end user with the best and most relevant results from the billions of indexed pages.
Considering the sheer magnitude of what’s already out there online, how do you get your site crawled and your content indexed? Google has to prioritize its efforts, after all.
To see your site in search results, and then see your site ranking for your target terms, you must have the right technical SEO basics in place and the most important SEO elements optimized.
Let’s start our checklist of technical SEO best practices with the most important aspects: crawling and indexing.
Recommended Reading: 3 Common Search Engine Indexing Problems
Google has to prioritize which web pages it crawls. The web is a big place, but not all websites meet Google’s standards – some websites have low-quality, spam, or stub pages that Google does not want to crawl or index.
And while the minimum purpose of SEO is to ensure that your site gets crawled and indexed, the ideal goal is to optimize for ranking factors that increase search visibility so your site ranks well on the SERP.
To find issues with your site and identify areas for improvement – which makes Google’s life easier – you can run an SEO audit.
This will reveal any technical issues keeping Google from crawling and indexing your site, and alert you to technical optimizations or content that requires further improvement.
You’ll end up with a to-do list of SEO factors to improve. The challenge, however, lies in prioritizing those SEO issues.
Recommended Reading: What is a Site Audit?
Technical obstacles are some of the biggest challenges that brands face with their SEO efforts, and they are also arguably the most important.
If your site does not have good usability, all your other SEO efforts and content updates are worthless.
Luckily, there is a way to plan out your strategy for tackling technical and usability issues, all to ensure that Google can crawl and index your site.
Recommended Reading: 15 Common Technical SEO Issues and How to Solve Them
The basics: don’t block Google from accessing your site, and make it easy for Google to find and index your content. The following areas are where to evaluate your site’s crawlability and indexability, and they should be your priorities.
First, it’s important to have a sitemap that includes all important URLs. Sitemaps allow Googlebot to see the pages you deem most important.
Note: Google points out that having an XML sitemap that reflects valid site URLs does not guarantee your site will be crawled, but you are likely to benefit from having one because it ensures Google at least knows those pages exist.
Sitemaps are especially useful for large websites, where it can be hard to maintain an internal linking structure thorough enough to count on Google finding every page while crawling the site.
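To make this concrete, here is a minimal sketch of an XML sitemap; the URLs and dates are placeholders, not a prescription for your site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap listing the pages you most want Google to crawl -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/seo-audit</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

You can reference the sitemap from your robots.txt file or submit it in Google Search Console so Google knows where to find it.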
That said, a sound internal linking structure is another way to ensure Google navigates your site properly. You’ll want to point more links at your most important pages, which creates a clear page hierarchy. Remember to use relevant anchor text that gives context to where the link leads.
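As a quick illustration (the page path and anchor text here are made up), descriptive anchor text looks like this:

```html
<!-- Descriptive anchor text tells Google (and users) what the linked page is about -->
<a href="/services/technical-seo-audit">our technical SEO audit service</a>

<!-- Generic anchor text gives no context about the destination -->
<a href="/services/technical-seo-audit">click here</a>
```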
Although it’s outside of your direct control, showing Google that other people value your content also helps crawlability. When external links, also known as backlinks, point to your website, Google picks up on the importance and relevance of your site.
(Link building also demonstrates your content's authority, which will help with search visibility – but content is for after you've prioritized your technical SEO!)
Another influence on crawlability is the robots.txt file. This file tells a bot which pages to crawl and which to skip.
You need to grant Google access to your pages if you ever want them indexed, so make sure the robots.txt file isn’t blocking content that’s intended for the index.
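As a rough sketch (the paths and sitemap URL are placeholders), a simple robots.txt that grants broad access while keeping a few private areas out of the crawl might look like this:

```text
# robots.txt - tells crawlers which paths they may and may not crawl
User-agent: *
Disallow: /admin/
Disallow: /cart/

# A common mistake is an overly broad rule like "Disallow: /",
# which blocks the entire site from being crawled.

Sitemap: https://www.example.com/sitemap.xml
```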
Recommended Reading: 14 Common Issues with the Robots.txt File in SEO (and How to Avoid Them)
We also recommend you serve pages on HTTPS and implement a site-wide rewrite to HTTPS URLs.
This way, any visitor – including Googlebot – that reaches an HTTP URL will be automatically redirected to the HTTPS version of the site with a server-side redirect.
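The exact implementation depends on your server software; as one possible sketch for Nginx (the server names are placeholders), a site-wide redirect could look roughly like this:

```nginx
# Redirect all HTTP traffic to the HTTPS version with a permanent (301) redirect
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}
```

A 301 tells Google the move is permanent, which helps consolidate ranking signals on the HTTPS URLs.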
Also pay attention to (and address) broken links, multiple redirects, and slow server response, which all negatively impact your crawlability.
Crawlers have budgets, and would rather spend their time on sites that are fast and error-free. When a site is riddled with these errors, Google concludes that it’s a troublesome site to work with and simply not worth the effort.
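One low-tech way to spot redirect chains and broken destinations by hand is to follow a URL with curl and watch the status codes (the URL below is a placeholder):

```bash
# -s silent, -I headers only, -L follow redirects; each hop's status line is printed
curl -sIL https://example.com/old-page | grep -E "^HTTP"
# Several 301/302 lines in a row indicate a redirect chain worth flattening;
# a 404 at the end indicates a broken destination.
```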
Now that the site is crawlable, we can turn to technical on-page SEO that will make Google’s life a little easier.
After you’ve prioritized the above technical SEO basics to have your pages crawled and indexed, it’s also important to focus on other areas of technical SEO to ensure your site offers a good user experience.
You want your pages to have a title tag, an H1 tag, and an optimized meta description. This metadata provides Google and the end user with more context about the content of your pages.
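For illustration only (the titles and description below are hypothetical), those elements live in a page’s markup like this:

```html
<head>
  <!-- Title tag: shown in the SERP and a strong relevance signal -->
  <title>Technical SEO Checklist: Crawling and Indexing Basics</title>
  <!-- Meta description: shown as the snippet in search results -->
  <meta name="description" content="A prioritized checklist for getting your site crawled and indexed by Google.">
</head>
<body>
  <!-- One H1 that matches the page's main topic -->
  <h1>Technical SEO Checklist: Crawling and Indexing Basics</h1>
</body>
```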
In its “How Search Works” documentation, Google explains that if keywords appear in a page’s body text or title, that page is more likely to be relevant to the searcher’s query.
As we’ve seen, Google works hard to understand the contents of a page.
You can include structured data to give Google explicit signals about the meaning of a page – schema markup is another way to make Google’s life easier.
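As a small example (the headline, author, and date are placeholders), an article page might embed schema.org markup as JSON-LD:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO Checklist: Crawling and Indexing Basics",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2024-01-15"
}
</script>
```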
For a comprehensive look at structured data and schema, review our schema webinar (with transcription) here: Technical SEO Best Practices: Schema [WEBINAR].
Finally, you’ll want error pages to return a 404 status code. For Google to properly index your content, your site must return status codes that match the actual state of each page – a page that no longer exists should return a 404, not a 200 with an error message in the body (a so-called soft 404).
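A quick way to verify this (the URL below is a placeholder) is to request a page you know doesn’t exist and check the returned status code:

```bash
# -s silent, -o discard the body, -w print only the HTTP status code
curl -s -o /dev/null -w "%{http_code}\n" https://example.com/this-page-does-not-exist
# Expected output for a correctly configured site: 404
```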
Optimizing the site for technical SEO can also mean a faster page load time, since bots and users alike get frustrated when a page loads slowly.
Slow pages give bots a poor crawl experience and send users back to the SERP. To get a faster-loading site, ensure the following are true for your site:
Recommended Reading: Your Guide to Optimize On-Page SEO
Following these technical SEO steps in priority order makes it very likely that your pages will be crawled and added to Google’s index so they start to show up in search results.
But as we’ve seen, technical SEO optimization isn’t just important for crawlability and indexability; it’s also connected to the user experience.
Not only do you want people to find your site, you also want them to have a good experience on it – quick page loads, titles and headings that clearly signal what each page covers – so that they not only stay on the page but convert.
After you've completed the above steps, you can turn your attention to keyword research, SEO content writing, and other content-centric elements that will further enhance your web pages so you can properly address users' queries and build up your search visibility.