While technical SEO can seem quite daunting at the best of times, here is a simple truth: technical SEO will do more for your rankings than any blog post or landing page on your site.
In this article, we put together a technical SEO checklist for small businesses and everything you need to know about this process. In plain English!
However, in order to better understand the intricacies of this checklist, it is just as important to know the difference between SEO and technical SEO, for they are not the same. As you know, search engine optimization (SEO) is a set of tactics and strategies that can increase traffic to your website. In essence, SEO is responsible for how highly a website will rank within search engine algorithms and how easily online users will be able to find the content that you have taken time to optimize. In addition, as we previously argued, nowadays, SEO is more important than any paid campaign if you really want to grow your business over time. But that's just part of the story...
Here is another simple truth: content is king, but technical SEO is the queen. Without the queen by his side, the king will lose the game. It's that simple!
Technical SEO refers to the process of optimizing the structure of a website to help search engines access, crawl and interpret the content on a digital platform. In other words, technical SEO is not concerned with the actual content; its main focus is whether the website is, for lack of a better phrase, 'Google ready.' Technical SEO focuses on factors such as how your site is coded with Google's algorithms in mind, namely the structure and speed of your site.
As a reminder, as of October 2018, Google started deprioritizing search results for sites that were too slow. Technical SEO is an intricate process that includes attention to missing tags, broken links or SSL certificates. Don't worry, we'll cover each of these critical factors in more detail below!
Although search engines certainly show a preference for large organization and authority websites, technical SEO is an affordable strategy that can help your company stand out from the crowd and capture placement on the coveted first page of Google results for the specific keywords you are targeting. That is to say, this strategy can help small businesses increase their search engine rankings and use strategic steps to compete with larger companies.
Either way, traffic drives everything in the online world and technical SEO can be employed to help small businesses gain more traction in this competition. What's more, technical SEO can provide a solid foundation for any website and ensure that you have the right strategy in place to support the long term success of your business.
User-friendly navigation and an appealing design are essential for attracting online users.
According to recent studies, 72% of online users are more likely to engage if the website has a mobile friendly design. What's more, over 70% of the population in the United States uses their smartphone on a daily basis and anything less than mobile-ready is sure to dissuade these users.
Most importantly, Google rocked the world in 2016 with their massive update which deprioritized all search results coming from sites that were not mobile friendly. They officially warned webmasters of this update in 2015.
Bottom line: it doesn't matter how amazing your content is in 2019. Even if your website has a great design, if it is not mobile optimized it will never show up in Google search results. That is why this technical SEO tip tops our list.
Creating a solid URL structure is very important when it comes to rankings on search engines. Search engines rely on your URL structure to quickly navigate and assess content on your website.
However, if this structure is difficult to understand, Google will assume the very same experience for users and mark the website down accordingly.
For example, you should have a specific keyword for every page and avoid superfluous words like "the" or "and."
Secondly, if your blog URLs include a date, forget about ever ranking well in Google.
Business owners with URLs in the following format, you know who you are: /blog/2019/07/16/this-is-a-keyword.
If your blog has dates in the URL, that's bad for business.
First, Google will assume your blog post is related to something that happened on that date. Secondly, Google rolled out an update in 2018 that gives priority to new content in search results. Without a date in the URL structure, you can return to an article a year later, update and re-publish it on a new date. If you include a date in the URL, your blog will always be associated with the original publication date.
Read this article for the perfect URL structure.
"HTTPS" is a protocol that protects the information or any transactional data on your website.
An SSL certificate is another element highly valued by Google algorithms. SSL certified websites account for more than 50% of page one rankings on search engines.
This certificate is a very affordable necessity for any website that enables any kind of payments.
Google has been prioritizing websites with "HTTPS" certificates in search results since 2014.
If your site does not have an SSL certificate, get one now!
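Once the certificate is installed, every HTTP visit should be forwarded to the HTTPS version. On an Apache server this is often done in the .htaccess file; the rules below are a common sketch (your host may require a slightly different setup):

```apacheconf
# Redirect all HTTP traffic to HTTPS with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The 301 status tells Google the move is permanent, so the HTTPS version inherits the rankings of the old HTTP pages.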
As a website owner, you always want to ensure that traffic is directed to the right pages that are in line with your primary objectives from search engines.
Canonical tags can make sure that your most important content is not overlooked due to similar content on other pages of the website.
But wait, what are canonical tags? If you type in your search bar "websitemagazine.com" or "www.websitemagazine.com" both options redirect to "https://www.websitemagazine.com/."
To a human, there is no difference between these two pages. However, to the Google algorithms these addresses are not the same. If you haven't set up redirects for this, Google will treat each version of your site as separate entities which will hurt all of your SEO efforts.
Canonical tags are hosted in the header of your website and tell Google which version of a page is the default. Used properly, canonical tags can ensure that there are no issues involving duplicate pages or the same content featured on multiple pages.
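In practice, a canonical tag is a single line in the page's &lt;head&gt;. Using the websitemagazine.com example above, every variant of the homepage would carry:

```html
<!-- Placed in the <head> of each duplicate or variant page,
     pointing at the one version Google should treat as default. -->
<link rel="canonical" href="https://www.websitemagazine.com/" />
```

Whichever URL a crawler arrives at, the tag tells it which address gets the ranking credit.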
Search engines try to serve users with content for their specific region.
Language tags are important for this process because they tell Google the precise language being used on a specific page, which enables the search engine to provide exactly the right page to the end user.
More specifically, the hreflang tag is a piece of code that helps Google navigate to the page that best accommodates the language of the end user.
In simple terms, if your website is dedicated to the US market, but it doesn't tell Google "this site is in English", you may never be able to rank organically for any specific keyword, no matter how much SEO work you put into your site.
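Here is what hreflang tags look like in the &lt;head&gt; of a page; example.com and the Spanish subdirectory are placeholders for your own setup:

```html
<!-- English (US) version, a Spanish version, and a fallback
     for users whose language is not covered. -->
<link rel="alternate" hreflang="en-us" href="https://example.com/" />
<link rel="alternate" hreflang="es" href="https://example.com/es/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Each language version of a page should list itself and all of its alternates, so the tags form a complete, reciprocal set.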
It's no use having great content if search engines are unable to crawl this content. Advanced search operators can help identify precisely which pages Google is unable to crawl.
Why is this a problem?
Maybe your website needs to be updated or maybe it's a brand new site and search engines have yet to find it. Or maybe your developer launched your site in production but blocked search engines from crawling your site.
This mistake is more common than you might expect. Indeed, the mistake is so common that Google now offers the option to check whether your site is fully indexed (here). If you have connected your site to Google Search Console, you can check in real time whether Google can index your entire website.
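A quick manual check is the site: search operator, e.g. searching Google for site:yourdomain.com to see which pages are indexed. If nothing shows up, the usual culprit is the robots.txt file at the root of your site. The sketch below contrasts the classic "developer forgot to unblock production" mistake with a typical healthy configuration (the sitemap URL is a placeholder):

```text
# BAD: this robots.txt blocks every crawler from the entire site.
User-agent: *
Disallow: /

# GOOD: allow everything and point crawlers at your sitemap.
User-agent: *
Disallow:
Sitemap: https://example.com/sitemap.xml
```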
Navigation is not only hugely important for users but also for the spiders that crawl your website on behalf of search engines. An XML sitemap is a useful assistant in this process.
What is an XML sitemap? It is a list of pages you want Google to crawl on your site. To be clear, you can still rank in Google without an XML sitemap. But, without it, Google may NEVER fully index the entire website.
An XML sitemap is used to simplify this process for everyone and make sure that the most important pages on your website are ranked accordingly on search rankings.
Why? Basically, this is because of the way Google works. Google reaches your website by following a backlink you have received. Then it starts indexing your current pages and follows any links you provide to your internal pages. This process continues until Google has indexed every page on your site with a link pointing at it.
But if any pages on your site are NOT linked to other pages, Google cannot find them.
To fix this issue, Google now allows webmasters to upload a full sitemap with every single page you want to show in search results in order to get full visibility of your website. You can do that here.
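For reference, an XML sitemap is just a plain file following the sitemaps.org protocol. A minimal sketch, with placeholder URLs, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2019-07-16</lastmod>
  </url>
  <url>
    <loc>https://example.com/services/</loc>
  </url>
</urlset>
```

Most CMS platforms and SEO plugins generate this file automatically; you just upload or submit its URL in Search Console.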
Crawl budget refers to the extent to which Google can index your website without degrading the user experience for the actual people visiting your site. In simple terms, when Google sends its automated bots to your site, 'Googlebot is designed to be a good citizen of the web' (source)
So let's say your hosting provider is not giving your site enough bandwidth. That means your site can only handle so many requests from visitors coming to your website before it slows down or, worse, it cuts the connection to your site from new visitors. This is all too common for example around Thanksgiving Day when some retailers' websites "go down".
Now let's say you choose a hosting provider that limits how many users can access your site, and a Google bot comes to your website. If the bot thinks crawling your site will slow down the experience for other users, it will simply abandon its attempt to crawl your site.
If you don't have enough 'crawling budget,' you will suffer when it comes to SEO search results because Google has left your site before completing its attempt to fully index it.
As you may know, Google takes plagiarism very seriously. Although placing the same content in different parts of your website is not so serious, duplicate content can destroy SEO rankings.
You should never have two pages on your website with the same content. Neither should you ever have content on your site that has been published elsewhere.
One obvious way to remedy this problem is by re-writing the duplicate content until every page on your site is 100% unique.
How can you test a page for duplicate content? Simply copy/paste some text from a page and perform a search on Google with this text "in quotes".
At its worst, Google may actually issue a manual penalty against your site for duplicate content. Check out this article from Neil Patel which shows an extreme scenario. In the example given, a PR firm copied the homepage text for a new client and released it as a press release which was picked up by many websites. That site got manually penalized and Google completely de-indexed it (you could not find it at all in Google).
In most cases, penalties are not this extreme. Even so, you shouldn't risk any duplication and should take extra steps to make sure your content is 100% unique.
Structured data is markup added to your HTML that helps search engine crawlers understand what a page is actually about, and it powers the rich details displayed next to a meta description. For example, if you perform a search for a marketing agency, you will find the name of the agency along with other information such as the opening hours or online reviews.
You can improve your search rankings by adding structured data markups to make pages on your website more relevant and readable for the crawlers and, especially, to make them show up in local search results.
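The most common format is a JSON-LD script in the page's &lt;head&gt;, using the schema.org vocabulary. Here is a sketch for a hypothetical local business (all names, hours and ratings are made up for illustration):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Marketing Agency",
  "url": "https://example.com/",
  "openingHours": "Mo-Fr 09:00-17:00",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "27"
  }
}
</script>
```

This is the markup behind the opening hours and star ratings you see directly in search results.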
Accelerated Mobile Pages (AMPs) are very fast mobile pages that have a stripped-down HTML structure. These pages are prioritized for search results on mobile devices.
Search engines like these pages because the format is clear and AMP articles have a higher share rate.
AMPs are also very easy to install. If you have a WordPress website or similar, you can enable this feature with the click of a button.
So if you want to get an edge over your competitors, simply go to Google and look for AMP plugins which can be installed on your site to render super fast pages for mobile users. A great developer can make your site AMP optimized in a day.
Google wants each user to know exactly where they are when visiting a website. The easiest way to do that is to provide a simple breadcrumb structure that helps both users and Google understand location.
While not every website uses the breadcrumb trail, this is a very useful strategy for SEO because it helps Google bots understand the site hierarchy.
By the way, there are three main types of breadcrumbs:
Location-based breadcrumbs – Show the structure of pages and help users know how to navigate the site.
Attribute-based breadcrumbs – Show the attributes of the current page (common in e-commerce) to help visitors refine what they are browsing.
Path-based breadcrumbs – Show the path through which the user has come. For example: Home>Blog>Current Page
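You can also mark breadcrumbs up as structured data so Google can display the trail in search results. A sketch for the Home>Blog>Current Page example above, with placeholder URLs:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog",
      "item": "https://example.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "Current Page" }
  ]
}
</script>
```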
According to many studies, the faster the page speed, the higher the organic search rankings.
Site loading will often determine whether a user sticks around; search engines pay close attention to this aspect of user behavior.
What's more, improving this speed also lowers bounce rates and increases the rate of engagement.
Google has gone so far as to develop an online tool that tells you how your site is doing from a page performance point of view called PageSpeed Insights. Go to this link and run your website through it.
If you get a score of 50 or above, you don't have to worry about your site performance having a negative impact on your SEO results. But if your score is 1 to 49, it means Google is deprioritizing your content. It's also worth noting that the lower your score is, the worse your results will be. In other words, your search results will be penalized gradually.
The Google tool above will give you a separate diagnostic for mobile and desktop. You should definitely examine each element individually.
Your website structure may change over time.
Case in point - you read our tip regarding dates in URLs. Then you ask your development team to remove dates from your site.
Changing the URL structure is great, but you also need to make sure you put redirects in place. In the example above, someone who finds your old links can click on them and be redirected to your new page.
What happens if you don't make redirects? Users end up on 404 pages (page not found).
Believe it or not, 404 errors are a distinct red flag for search engines. For this reason, any time you change your site's URL structure, you must set up 301 redirects.
As a rule, it's important for users to have access to the most recent version of every page. Anything less is considered a broken link and a black mark for SEO.
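On an Apache server, a redirect like this is often a single line in the .htaccess file. Continuing the dated-URL example from the tip above (the new path is a placeholder):

```apacheconf
# Permanently redirect the old dated blog URL to its
# new, date-free address.
Redirect 301 /blog/2019/07/16/this-is-a-keyword /blog/this-is-a-keyword
```

Anyone clicking the old link, and any ranking signal attached to it, is passed on to the new page.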
You should always try to upload high quality images with a smaller size or number of pixels.
This is the best way to avoid large images slowing down your website.
If it is too late for that, certain tools such as "Smush" might be needed to downsize the current image on the website.
By the way, the Google Page Insights tool we discussed above will tell you exactly what images need to be optimized.
Here's how you know whether your development team is good. Use the Google Page Insights tool to see whether your website was properly coded.
If your site was designed by inexperienced developers, your page "weight" will be high. In other words, your developer didn't correctly code the website (too many lines of code).
To fix this, a good developer will need to minify CSS or JavaScript in various ways. This will decrease the page weight for your site, thus making it faster for the end user.
For instance, removing white space or unnecessary comments on CSS can free up space. You can even convert hundreds of lines of code into a single line.
Needless to say, this is a job for a professional rather than a website owner with no coding experience.
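To make the idea concrete, here is what minification does to a small CSS rule (the class name is a made-up example):

```css
/* Before: readable, but heavier than it needs to be. */
.hero-title {
    color: #333333;
    margin-top: 0px;   /* a comment the minifier will strip out */
}

/* After minification: the same rule, one line, fewer bytes. */
.hero-title{color:#333;margin-top:0}
```

Multiply that saving across thousands of lines of CSS and JavaScript and the page gets noticeably lighter.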
HTTP is the protocol browsers use to request every file (page, image, script, stylesheet) that makes up your website. Reducing the number of HTTP requests each page triggers can speed up the website.
For example, you can combine multiple CSS or JavaScript files into one in order to reduce the number of requests sent to your server.
Alternatively, CSS sprites can optimize images in such a way that they send fewer requests to the server, while inline images save space by embedding the data of an image into the actual page.
Browser caching can speed up your website and you can update the browser cache policy to optimize how this process takes place.
Simply put, browser caching lets returning visitors reuse files they have already downloaded, and you can set certain properties for every session so that the cache works more efficiently.
For instance, you can set the cache to 'no-cache' which prohibits caching entirely or to 'private' which only enables caching for local usage.
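On an Apache server, these cache policies are typically set as Cache-Control headers. A sketch, assuming mod_headers is available (the durations are illustrative, not recommendations):

```apacheconf
# Let browsers cache static assets for 30 days,
# while HTML is always revalidated with the server.
<IfModule mod_headers.c>
  <FilesMatch "\.(css|js|png|jpg|svg)$">
    Header set Cache-Control "public, max-age=2592000"
  </FilesMatch>
  <FilesMatch "\.html$">
    Header set Cache-Control "no-cache"
  </FilesMatch>
</IfModule>
```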
This might seem a little advanced, but this is precisely where an SEO agency can help you.
An antiquated coding practice is to show nothing to the end user until every element of the page has loaded in the background.
Many small business websites are still coded this way.
Expert developers will show a page to the user as soon as the information above the fold is loaded (what you see at first sight on a page, without scrolling). Then the rest of the page (under the fold) is sequentially loaded.
Make sure your site is coded in such a way that the user can see the page as soon as the info above the fold is loaded.
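One common technique is deferring non-critical JavaScript so it never blocks the first render; the filename below is a placeholder:

```html
<!-- A script loaded with "defer" downloads in the background and
     runs only after the HTML has been parsed, so the content above
     the fold can render immediately. -->
<script defer src="/js/main.js"></script>
```

Critical CSS for the visible part of the page can likewise be inlined, with the rest of the stylesheet loaded afterwards; your developer can advise on the right mix for your site.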
Broken links are links on your website that point to pages that no longer exist. This happens when you linked to a website that went out of business, or to a website that changed its URL structure without setting up redirects for the new pages (tip #14 above).
Many website owners disregard the prospect of broken links, even though the chances of having them are extremely high.
More importantly, search engines hate these links and most certainly penalize websites that link to broken sources.
You should take time to check for broken links and either remove them completely or insert a direct replacement. Tools such as Ahrefs or SEMRush provide 404 reports for any website.
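If you are curious how those tools work under the hood, the first step is simply collecting every link on a page; each URL is then requested and any 404 response is flagged. A minimal sketch using only the Python standard library (the sample HTML is made up):

```python
# Extract all <a href> targets from a page's HTML. A full audit
# would then request each URL and flag any 404 responses.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Collect the href attribute of every anchor tag.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html: str) -> list:
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

page = '<p><a href="/blog/">Blog</a> and <a href="https://example.com/">a partner</a></p>'
print(extract_links(page))  # ['/blog/', 'https://example.com/']
```

Dedicated crawlers like Ahrefs or SEMRush do this at scale across your whole site, which is why their 404 reports are the practical choice.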
Internal links have a surprisingly high amount of 'juice' and search engines use these links to determine the importance of pages and pieces of content.
Google also appreciates these links because they add to the overall structure and make the site much easier to navigate.
If you want a cheap and easy way to boost SEO rankings, focus on internal linking and improve the navigation on your website in the process.
Here's a simple rule of thumb. Every time you write a blog on your site, always link to at least 2 internal pages (e.g., service pages, industry pages, product pages.) The more you link to internal pages, the more you tell Google 'those pages are really important for my site'.
Every business owner should be signed up for Google Search Console, previously known as Google Webmaster Tools.
This free tool is the one and only way Google can alert you to an SEO issue with your site. Believe me - at one point or another, there will be an issue affecting your SEO results.
Search Console can really enhance a website and ensure that search engines find and index your content. Whether you need to improve keyword strategy or HTML, it can also provide structure to your data, while quickly and efficiently taking care of many other minor SEO issues.
Page structure is everything in SEO these days. Although long-form content is clearly favored by the most recent Google updates, the structure of this content is equally important.
In this sense, you must have the right heading structure with H1, H2, and H3 subheadings in place.
Alt tags on images are also important but as a rule, it's essential to ensure that every piece of content has been optimized for SEO.
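Put together, a well-structured page body looks something like this (the headings and filenames are placeholders):

```html
<!-- One H1 per page, H2s for major sections, and descriptive
     alt text on every image. -->
<h1>Technical SEO Checklist for Small Businesses</h1>

<h2>1. Make Your Site Mobile Friendly</h2>
<p>...</p>
<img src="/images/mobile-test.png"
     alt="Mobile-friendly test result showing a passing score">

<h2>2. Fix Your URL Structure</h2>
<p>...</p>
```

The heading hierarchy tells crawlers how your ideas nest, and the alt text tells them what each image shows.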
As you can see, there are multiple elements and factors to check. This comprehensive checklist should ensure that all the basics are in place.
Also, you should notice from this checklist that nothing happens overnight. Mastering technical SEO is often just as much about patience as putting in the time to make these changes.
Every little effort helps and every technical improvement is sure to deliver results.