Every website I build is created with technical SEO best practices in mind. However, over years of use, a website undergoes changes and additions, so it’s important to revisit its technical SEO and see what areas could be improved. Like a spring cleaning, a technical SEO audit and the repairs that follow keep your website optimized and performing at its best, so that your customers can discover it through search engines.

What is technical SEO?

To better understand why your website needs to employ the tactics of technical SEO to get found, it’s helpful to first understand how your website gets listed in search engines. Once a page is published on your website, for example https://mybusiness.com/services, it will not instantly be listed on search engines such as Google. Countless new web pages are published daily, and for a page to be indexed by Google, it must first be “crawled.” This means that Googlebot visits pages on websites hosted on servers throughout the world and updates its index based on what it finds. For a web page to be listed for a keyword search, for example “best coffee in Cleveland,” the page’s content must meet a variety of criteria that Google considers when determining whether, and how high, to rank the page. Content and other on-site or off-site SEO strategies are beyond the scope of this article, but they are the next steps to take after implementing solid technical SEO on your website.

Technical SEO relates to how your website is developed: it should be easy to navigate and free of any technical issues that would prevent it from being crawled and indexed by search engines. Think of your website as a public library – it could hold all the greatest books in the world, but if the books aren’t labeled, or are scattered all over the place and difficult to access, the library is going to be useless.

Technical SEO is just one piece of the whole SEO puzzle, but it is a critical first step, and one that should be revisited frequently, because it is what allows search engines to crawl your site and rank it in the search results in the first place.

Even before beginning work on technical SEO, I encourage reviewing your website’s analytics to establish where you currently stand, so that you have a baseline to compare your results against. Since most businesses’ primary goal for their website is to convert leads or sales, it can also be helpful to create and track conversion goals in Google Analytics. Examples of conversions are reaching a particular page, clicking a link, or submitting a form, either to send an email or to make a purchase. For more information, watch the video How to set up Goals in Analytics.

Best practices used to improve technical SEO

Website Structure

A website’s structure – how the pages are linked to each other and organized in the navigation – is important to making the site easy to use, not only for your visitors but also for search engines. The following should be considered when developing your website’s structure:

  • Each URL should use words that reflect the content of the page, for example mysite.com/how-to-bake-an-apple-pie rather than mysite.com/id=1234
  • Pages should be no more than 2-3 clicks away from wherever your visitors are on the site. To help with navigation, use breadcrumbs – a secondary menu that shows the path from the primary page down through each level of subpages.
  • The main heading (H1) on each page should indicate the title of the page, and subheadings (H2-H5) should indicate the main topics presented on it. Lower-level headings carry less weight in the meaning of the page than higher-level headings. A markup sketch combining breadcrumbs and headings follows this list.
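
To illustrate the last two points, here is a minimal sketch of a page that combines a breadcrumb menu with a proper heading hierarchy; the page and section names are hypothetical:

<!-- Breadcrumb trail showing the page's place in the site structure -->
<nav aria-label="Breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/recipes/">Recipes</a> &gt;
  <span>How to Bake an Apple Pie</span>
</nav>

<h1>How to Bake an Apple Pie</h1>  <!-- one H1: the page title -->
<h2>Ingredients</h2>               <!-- H2s: the main topics -->
<h2>Instructions</h2>
<h3>Preparing the Crust</h3>       <!-- H3: a subtopic of Instructions -->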

XML Sitemap

An XML sitemap helps search engines crawl your site and understand its structure. This can help the important pages on your website get indexed more quickly, since you are telling Google how to identify and catalog your content instead of letting it aimlessly wander through the pages and files on your site.
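
A minimal sitemap.xml, placed at the root of your site, might look like the sketch below; the URLs and dates are placeholders for illustration:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>https://mybusiness.com/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
  <url>
    <loc>https://mybusiness.com/services</loc>
    <lastmod>2023-01-10</lastmod>
  </url>
</urlset>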

Robots.txt

A robots.txt file gives search engine robots instructions on how to crawl your website. Every website has a “crawl budget” – a limited number of pages that will be included in a crawl – so the robots.txt file helps by telling search engines which parts of the site to crawl and which to skip.
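
For example, a simple robots.txt (placed at the root of your site, e.g. https://mybusiness.com/robots.txt) might allow everything except an admin area and point crawlers to your sitemap; the /admin/ path here is hypothetical:

# Applies to all crawlers
User-agent: *
Disallow: /admin/

# Tell crawlers where to find the sitemap
Sitemap: https://mybusiness.com/sitemap.xml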

Image alt tags

While Google is getting better at recognizing the content of images, it is still best practice to include alt text in the markup for all of the images used on your site. Alt text is a short written description of a photo or graphic, and it is also read aloud by the screen readers used by visually impaired visitors.
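
In HTML, alt text is an attribute on the image tag; the file name and description below are made up for illustration:

<!-- alt: a short, specific description of what the image shows -->
<img src="/images/apple-pie.jpg" alt="Freshly baked apple pie cooling on a wire rack">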

Meta tags

Each page on a website can use a title tag and a meta description tag. Although these live in the code of the page rather than in its visible content, they tell search engines what your web page is about, and they are typically what appears as the headline and blurb of your listing in the search results. Because of this, be sure to include your focus keywords in both, so your page can be found for the search terms you want to rank for.
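
Both tags belong in the <head> of the page. A hypothetical example for a page about baking an apple pie:

<head>
  <title>How to Bake an Apple Pie | My Business</title>
  <meta name="description" content="A step-by-step guide to baking a classic apple pie, from preparing the crust to getting the bake time just right.">
</head>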

Structured Data

Structured data gives Google more specific information that can be used when indexing your web pages, and it can result in rich snippets being displayed in the search results. Rich snippets are Google search results that display extra data, like ratings or reviews, and that extra data helps your listings stand out from the rest. To see how structured data markup looks, try the Structured Data Markup Helper tool by Google. The following is an example of structured data that could be used on a page displaying a recipe for coffee cake:

<script type="application/ld+json">
{
  "@context": "https://schema.org/",
  "@type": "Recipe",
  "name": "Party Coffee Cake",
  "author": {
    "@type": "Person",
    "name": "Mary Stone"
  },
  "datePublished": "2018-03-10",
  "description": "This coffee cake is awesome and perfect for parties.",
  "prepTime": "PT20M"
}
</script>

Avoid Duplicate Content

Google aims to index only pages that have unique content. Because of this, if you have two or more pages with identical or even similar content, each will tend to rank lower than a single page would. Aside from removing the duplicate content, you can also use what are called “canonical tags” to identify the original source of the content and tell Google which version should be indexed instead of the others.
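
A canonical tag is a single line placed in the <head> of a duplicate page, pointing to the version you want indexed; the URL below is a placeholder:

<!-- Tells Google this page is a duplicate of the /services page -->
<link rel="canonical" href="https://mybusiness.com/services">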

Crawl Errors

Broken internal or outbound links, incorrectly implemented redirects, and 4xx (not found) or 5xx (server) errors all create a bad experience, not only for your visitors but also for web crawlers. Every page a search bot tries to crawl spends some of your crawl budget, so if you have many broken links, the bot will waste its budget on them and may never arrive at your relevant pages.
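
A common fix is to permanently redirect a moved or removed page to its replacement. As a sketch, on an Apache server this can be done with a 301 redirect in the site’s .htaccess file; the old path here is hypothetical:

# Permanently redirect the old URL to the current services page
Redirect 301 /old-services https://mybusiness.com/services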

Performance

Google rewards websites that load faster. If pages on your website display a lot of images or use other assets, like JavaScript or CSS files with large file sizes, it can slow down the time it takes for a browser to load the page. To mitigate this, there are a number of techniques, such as minifying, deferring, or combining JavaScript and CSS files, or reducing image file sizes while still maintaining quality resolution. Optimizing your website is especially important for mobile, since websites often load slower on mobile devices than they do on desktops.
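
As a small illustration, two one-line HTML changes can often help: deferring a script so it doesn’t block the page from rendering, and lazy-loading images that sit below the fold. The file names are placeholders:

<!-- defer: download the script without blocking rendering; run it after the page is parsed -->
<script src="/js/main.js" defer></script>

<!-- loading="lazy": only fetch the image when it is about to scroll into view -->
<img src="/images/gallery-photo.jpg" alt="Gallery photo" loading="lazy" width="800" height="600">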

Security

Google is dedicated to making sure all websites use HTTPS (hypertext transfer protocol secure) and, to encourage this, ranks those that do higher than those that don’t. Using HTTPS also ensures that your visitors’ data is encrypted in transit when they submit a form on your website, greatly reducing the risk of it being intercepted.
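
Once an SSL certificate is installed, all HTTP traffic should be redirected to HTTPS. As a sketch, on an Apache server with mod_rewrite enabled, a common .htaccess rule for this is:

RewriteEngine On
# If the request did not arrive over HTTPS, redirect it permanently
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]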

Mobile Friendly

More people search Google on their phones than on their desktops, so it’s essential to have a website that is optimized for mobile. The main technique for making a website mobile-friendly is responsive design. This means that the rows and columns that make up the structure of each web page automatically condense and reorder so that the site is still easily usable on smaller screens. Other changes that can be set to occur automatically depending on screen size include displaying different image or text sizes to better fit a mobile layout, or showing a “mobile menu” that runs vertically rather than the typical horizontal menu, which may be too wide for a mobile device.
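
Responsive behavior is typically implemented with CSS media queries. Here is a minimal sketch (the class names and breakpoint are hypothetical) that stacks a two-column layout into a single column on small screens:

.row { display: flex; }
.column { width: 50%; }

/* On screens narrower than 768px, stack the columns vertically */
@media (max-width: 768px) {
  .row { flex-direction: column; }
  .column { width: 100%; }
}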

Tools for managing and auditing technical SEO

There can be a lot to go through in a thorough technical SEO audit. To help with this, there are several programs that can automatically perform an audit and present a detailed report, which you can then use to fix any issues that are found.

Google Search Console

One way to ensure that Google has correctly crawled and then indexed your content is to leverage Google Search Console, a free tool from Google.

Once you create your free account, you can do several things like monitor when a new webpage you created is indexed. You can also submit sitemaps, making it easier for Google to identify and correctly catalog your site’s content.

Screaming Frog

Screaming Frog’s SEO Spider is a website crawler (free for smaller sites) that can help with finding duplicate page titles and descriptions, as well as examining URL structures to determine what needs to be fixed.

PageSpeed Insights

Google’s PageSpeed Insights tool analyzes a specific page’s loading speed and user experience and presents a list of recommendations for areas that can be optimized.

Mobile-Friendly Testing Tool

Google’s Mobile-Friendly Test assesses how well your website is optimized for mobile, which has become an important ranking factor.