53-Point Technical SEO Checklist With Actionable Tutorials

Before we get started, here are three things you must know about this checklist.

  1. Every check links to a detailed tutorial. Each tutorial covers four aspects of an SEO fix: what the fix is about, why it is important, how to check for the issue, and how to fix it. So you will get a complete solution to your problem within each linked article.
  2. Each of these tutorials has a list of high-quality resources to help you learn more about the topic of concern. Don’t miss them.
  3. For these tutorials, I have used DeepCrawl or Screaming Frog. You can implement the same checks using other crawlers too, but the approach will vary.

This is a long read and I suggest you bookmark this article for future reference. I will try to update this list and the linked tutorials every three months.

Crawling And Indexation

Are All Invalid Pages And Folders Blocked By Robots.txt? – Robots.txt is a file of directives that gives search engines a set of crawling rules to follow. It typically sits in the root folder of your website. When a crawler visits your website, it reads the robots.txt file and crawls the site accordingly. Invalid pages and folders are files on your website that are near duplicates of each other. They offer little value to the user, nor do they communicate anything in particular about your website to search engines. Our job as webmasters is to block all such invalid pages and folders from search engine access.
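
To illustrate, here is a minimal robots.txt sketch; the blocked paths are hypothetical examples, not rules for your specific site:

```
# robots.txt lives at the root, e.g. https://www.example.com/robots.txt
User-agent: *
# Hypothetical low-value sections blocked from crawling
Disallow: /search/
Disallow: /*?sessionid=
# You can also point crawlers at your XML sitemap here
Sitemap: https://www.example.com/sitemap.xml
```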

Does Your Site Have A Directory With Links To All The Pages? – In SEO terms, a site directory is called an HTML sitemap. A site directory is an HTML page that links to all possible URLs on your website. Depending on the size and structure of your website, an HTML sitemap might span multiple pages. Not all websites need a site directory. You will need one if your site has thousands of pages, or if it has a large archive of content pages that are not linked from the main site. If you have standalone pages or landing pages that don’t link to each other, an HTML sitemap can solve this problem. If your site already has a page (or pages) with links to all other pages, then you might already have an HTML sitemap. Check with your tech team whether such a page was developed previously.
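
In its simplest form, an HTML sitemap is just a structured list of links; the URLs and labels below are hypothetical:

```html
<h1>Site Directory</h1>
<ul>
  <li><a href="/blog/">Blog</a>
    <ul>
      <li><a href="/blog/technical-seo-checklist/">Technical SEO Checklist</a></li>
    </ul>
  </li>
  <li><a href="/products/">Products</a></li>
  <li><a href="/about/">About Us</a></li>
</ul>
```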

Are your dips in crawl rate justified? – Googlebot crawls web pages in order to update its index with new and fresh content. Crawl rate is measured in URLs crawled per second. When Googlebot hits your website, the rate at which it crawls should be optimal, so that your new content gets discovered without putting load on your server. You can see this in the “Pages crawled per day” report in Search Console. If there are erratic, unexplained jumps or dips in this graph, you should investigate further.

Have you fixed your 5XX Error pages? – A 5XX error is a general term for HTTP status codes starting with the number ‘5’. Examples include 500, 503 and 508. This error is returned by the server when something has gone wrong on its end, but it cannot be more specific about the reason.

Do all your Error Pages Show a 4XX HTTP Status Code? – A soft 404 means that a URL on your site returns a page telling the user that the page does not exist, but sends a 200-level (success) status code to the browser. In some cases, instead of a “not found” page, it might be a page with little or no usable content, for example a sparsely populated or empty page.
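
At the HTTP level, the difference is only in the status line; the response lines below are illustrative:

```
# Correct: the "page not found" response carries an error status code
HTTP/1.1 404 Not Found

# Soft 404: the body says "not found" but the status code claims success
HTTP/1.1 200 OK
```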

Are all your internal links pointing to 200 OK pages? – 200 OK is an HTTP status code. Pages returning 200 OK are termed valid by your web server: the code tells the browser or crawler that the page is fine and can be served to the user. Internal links are how different pages on your website connect with each other, usually via hyperlinked text. Broken links appear on a site when a page links to a 4XX or 5XX error page.

Are you dealing with your expired content right? – Expired content pages are pages that no longer serve the content they previously hosted. They are like an empty shelf in a supermarket. If you don’t handle expired content right, it leads to a bad user experience. A user lands on a page from a search engine expecting her question answered; when she sees a page that doesn’t serve that content, it leads to frustration. She might get lost while browsing your site and never find the product she is looking for. From a search engine crawler’s perspective, it leads to inefficient crawling: when search engines repeatedly hit 4XX pages or expired content pages that offer no value, you are depleting your site’s assigned crawl budget.

Is Google able to render all content on a page? – What Google’s crawler sees on a page might differ from what a user sees. A badly rendered page can lead to Google missing valuable content that is part of the page, or even misunderstanding the layout of the page. This could mean a lower ranking for your page and site.

Are your mirror sites blocked from search engine access? – Mirror sites are duplicate versions of your main website: a set of pages that are exact copies of your main site, usually hosted on a separate domain or a subdomain. When mirror sites are accessible to users and search engines, it can lead to a loss of traffic and revenue for the main site.

Does your site have an SEO-friendly URL structure? – An SEO-friendly URL structure can help your SEO in multiple ways. Clean URLs add to the user experience, improve search engine rankings, and help you acquire more high-quality links and shares.

Does Your Sitemap Follow Protocol? – XML sitemaps help search engines like Google and Bing get a complete list of valid URLs on your site. But XML sitemaps have to follow a set of rules in order for search engines to understand and crawl them in an optimized manner. For this, you have to make sure your sitemap follows the Sitemap Protocol.
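
Here is a minimal urlset that follows the sitemaps.org protocol; the URL and values are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```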

Do you have a clean Sitemap? – A clean sitemap is one that contains only valid URLs that you want search engines to index. Every website is given a set crawl budget by Google. A well-optimized website uses this limited crawl budget effectively by serving only worthy pages in its sitemap. To do so, you must remove pages/URLs that don’t serve any purpose.

Did You Submit Your Sitemap To Google And Bing? – When you submit your sitemap to Google and Bing, you will get data on whether there are any errors in those sitemaps, and whether the crawler is able to access the sitemap at all. The other benefit is that you get valuable data like the indexed-to-submitted ratio. If that ratio is too low, the affected URLs can be exposed to further investigation.

Did You Create Separate Sitemaps For Each Category? – If your site has thousands of URLs, chances are that a few of those URLs are not even discovered by Google because of the sheer size of your site. You want to understand which areas of your site are not getting crawled and indexed. For this purpose, you can split your main sitemap and create sitemaps based on the different categories or sub-sections of your site. Once you submit each of these sitemaps to Search Console, you will understand which areas of your site have indexation issues.
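
One common way to tie the per-category files together is a sitemap index file; the category names here are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-blog.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml</loc>
  </sitemap>
</sitemapindex>
```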

Does Your Site Have A Video Sitemap? – You can inform Google about all your video content and its location by including it in a video sitemap. This will boost your site’s presence in Google video search.
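
A video sitemap entry, using Google’s video sitemap namespace, looks roughly like this; all URLs and text are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://www.example.com/videos/some-video-page</loc>
    <video:video>
      <video:thumbnail_loc>https://www.example.com/thumbs/123.jpg</video:thumbnail_loc>
      <video:title>Hypothetical video title</video:title>
      <video:description>A short description of the video.</video:description>
      <video:content_loc>https://www.example.com/videos/123.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>
```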

Are All Your Page Redirects A Single Step Redirect? – A redirect is a hop a URL takes when a browser asks the server for a certain URL. In simpler terms, the request hops from the URL the user asks for to the alternative destination URL the webmaster wants to serve. Although Google has confirmed that 301 and 302 redirects no longer lose PageRank, redirects and redirect chains still cause many issues. Redirect chains cause slower load times for the end user and poorer crawl efficiency for the search engine bot.
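
If your server runs Apache, a sketch of collapsing a chain into single-step redirects (hypothetical paths, assuming mod_alias is enabled) might look like this:

```
# Chain to avoid: /old-page -> /interim-page -> /new-page
# Instead, point every legacy URL straight at the final destination:
Redirect 301 /old-page /new-page
Redirect 301 /interim-page /new-page
```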

Have You Used A Self-Referring Canonical Tag? – The canonical tag is one way of telling search engines that the URL in question is a duplicate of an original page, which can be found at the canonical URL. It is a tag placed within the <head> section of the page. A self-referring canonical tag points to the page’s own preferred URL, so that even parameterized or tagged versions of the URL declare the original as the canonical.
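
As a sketch, with a hypothetical product URL:

```html
<!-- On https://www.example.com/product/blue-widget (and on any parameterized
     version such as ...?utm_source=newsletter), the canonical points to itself -->
<link rel="canonical" href="https://www.example.com/product/blue-widget" />
```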

Do Multiple URL Patterns Return Unique Content? – Most websites have a URL pattern for every page type. For instance, your blog post URL template might be different from your category or product page template. For a website with multiple sections and page types, the number of URL patterns keeps increasing. Having multiple URL patterns on your site is not the problem; the problem occurs when multiple URL patterns serve the same or similar content. When they do, your site risks having many pages with duplicate content.

Have You Paginated Content With The Rel=”Next” Tag? – A simple way of optimizing your paginated series for Google’s crawler is by using the rel=”next” and rel=”prev” tags. Using these tags, we can tell Google that a certain set of pages belongs to a logical sequence. Once this directive is presented to Google, it understands the content better and also consolidates the link value and ranking signals of all such pages.
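
On a middle page of a hypothetical paginated category, the head section carries both annotations:

```html
<!-- On page 2 of /category -->
<link rel="prev" href="https://www.example.com/category?page=1" />
<link rel="next" href="https://www.example.com/category?page=3" />
```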

Have You Optimized Faceted Navigation For Search Engines? – Faceted search or navigation is a way of refining a broad category of products into what a user demands. Unlike traditional navigation, it gives users a choice to discover items along more than one dimension. Unoptimized faceted search can create duplicate pages, which can cause crawling and indexation problems.

Is All Your Crucial Content Out Of An Iframe? – An <iframe> tag is used to embed the contents of one web URL onto another page. Iframes are traditionally used to embed advertisements, widgets or resource links. Using frames for crucial content can cause issues for search engines and may prevent them from finding pages within your website.

Have You Implemented The Noindex Tag On Content Which Doesn’t Need To Be Indexed? – If you have a site of a few hundred pages, you might have pages that exist with the sole purpose of serving the user. These pages might have very little content or even duplicate content.
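
The tag itself is a one-liner in the page’s <head>:

```html
<!-- Keeps the page out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```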

Did You Use The Canonical Tag Wherever Necessary? – The canonical tag is one way of telling search engines that the URL in question is a duplicate of an original page, which can be found at the canonical URL. Once you use a canonical tag, Google understands the original source of the content and transfers all link equity and ranking power to it. If you find duplicate content on your site, the canonical tag is one way of solving it.

Are you monitoring your website downtime? – Site downtime means your site was inaccessible to users and search engines for a certain period of time. This can mean a loss of traffic and revenue for your business. If this happens repeatedly, search engines might grade your site as low-reputation. You should monitor how well your web hosting service is performing. For this, you can use free tools like StatusCake.

Have You Optimized Your “Out Of Stock” Product Pages? – Temporarily out-of-stock products are products whose inventory has run out. They can’t be sold until you replenish the stock. A product page with an out-of-stock item can be a problem: such pages don’t offer any value to the user. These pages should be optimized to make them more valuable to the user, so that she doesn’t bounce off.

On-Page Optimization

Are your Title Tags under 60 Characters? – Google usually truncates titles that are over 60 characters. This means any important details placed after the 60-character mark might not be read by the user, which can mean a loss of clicks.

Do You Have Schema Markup For Your Site? Is it implemented right? – Schema markup is one way to give search engines more information about the contents of a page. Using schema enables rich information to be displayed in your listing on the search engine results page. This means higher clicks and traffic.
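
A minimal JSON-LD sketch using schema.org’s Product type; the product details are hypothetical:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Blue Widget",
  "description": "A hypothetical product used for illustration.",
  "offers": {
    "@type": "Offer",
    "price": "29.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```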

Do All Your Site Pages Have An <H1> Tag? – From an SEO point of view, a keyword-rich <h1> tag can help you rank. Adding an H1 tag to a page and optimizing it for search engines is very easy, so it should be leveraged thoroughly to gain a ranking advantage over your competition. Apart from the core SEO benefits, a well-styled H1 with the right font size, font type and line height adds a lot of value to the aesthetics and user experience of the page.

Do All Your Site Pages Have An Optimized Title Tag? – Titles are an important SEO factor and also a low-hanging fruit, as both implementation and optimization are easy. When your pages don’t have optimized titles, you miss out on SEO and social media traffic.

Do All Your Site Pages Have A Meta Description Tag? – Although having keywords in the meta description is not a ranking factor for most search engines, a well-crafted meta description can influence ranking indirectly. When you use a meta description that is relevant to the page and contains the right keywords, your listing stands out to the user on the search results page.
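
Pulling the title and meta description checks together, the <head> of a well-tagged page might look like this; the copy is illustrative:

```html
<head>
  <!-- Kept under ~60 characters so it isn't truncated in results -->
  <title>Blue Widgets | Buy Durable Widgets Online</title>
  <meta name="description"
        content="Shop our range of durable blue widgets with free shipping." />
</head>
```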

Does Your Site Have Unique H1s? – When you use the same H1 on multiple pages, users who visit those pages might feel confused and might not recognize which content is most relevant to their needs. Ideally, every page you publish on your site should have a unique purpose, and that purpose should be stated in the H1 tag.

Does Your Site Have Unique Page Titles? – When you use the same title tag on multiple pages, search engines are usually undecided about which of those pages is the most relevant to the query a user has typed. They might rank the less relevant page, and users who visit it might bounce off. This leads to a loss of revenue.

Does Your Site Have Unique Meta Descriptions? – There are two benefits to having unique meta descriptions. First, the searcher can differentiate between the topics and content of your pages and understand what exactly each page is about. Second, the more duplicate meta descriptions you have, the higher the likelihood that your search engine snippets are not optimized for search traffic.

Do You Use An HTTPS Connection? – HTTPS encryption ensures secure data transfer between the user’s browser and your server. Google favors secure websites and uses HTTPS as a ranking signal. If you are looking to improve both the security and the search presence of your site, you should move to HTTPS.

Have you optimized your website for speed? – Page speed is a ranking factor. The faster the website, the better the ranking and user engagement. Slow-loading websites have lower conversion rates, leading to a loss of revenue.

Have You Implemented Hreflang Across Localized Sites? – If your business is based in the USA but you ship to customers in Spain, it is important to serve your website in Spanish rather than English. To help its search bots handle such cases appropriately, Google introduced the rel=”alternate” hreflang=”x” tag. Using this tag, you can tell Google that the page in context is an alternate version of the main website, with the language of the page defined by the hreflang reference.
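
On the hypothetical US English home page, the annotations might look like this; each localized version should carry the same set:

```html
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
<link rel="alternate" hreflang="es-es" href="https://www.example.com/es/" />
<!-- Fallback for users whose language/region isn't listed -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```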

Mobile Optimization

Have You Implemented AMP? – AMP is a Google initiative. The AMP Project enables the creation of websites and ads that are consistently fast, beautiful and high-performing across devices and distribution platforms. Although AMP is not a ranking factor, it can dramatically improve the speed of your mobile pages, leading to better engagement and conversions on your site.

Have You Implemented The Vary HTTP Header? – If your mobile site is served on a different URL, or certain elements are served differently by user agent, you need to use the Vary HTTP header. When you use it, caching systems such as internet service providers, browsers, search engines and content delivery networks keep multiple copies of your page in their caches, one per user agent. That way, the content delivered always matches the device from which it is being accessed.
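
The header itself is a single response line; on Apache, one way to add it (assuming mod_headers is enabled) is shown below:

```
# Raw response header telling caches to key on the requesting user agent
Vary: User-Agent

# Equivalent Apache configuration line
Header append Vary User-Agent
```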

If you are using a separate mobile site, have you implemented the mobile markup? – To optimize a separate mobile site, you first need to inform Googlebot of the relationship between the two URLs using <link> tags with rel=”canonical” and rel=”alternate” elements. Second, your server must detect user-agent strings and redirect users correctly, serving the desktop or the mobile URL as appropriate.
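
The annotation pair, using hypothetical desktop and m-dot URLs, looks like this:

```html
<!-- On the desktop page, e.g. https://www.example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page" />

<!-- On the corresponding mobile page -->
<link rel="canonical" href="https://www.example.com/page" />
```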

Does your site have any Mobile Usability issues? – If you have verified your site in Google Search Console, you get access to the Mobile Usability report. This report shows whether Google finds any mobile usability issues on your site, with errors like Flash usage, touch elements too close together, and small font sizes.

Is your mobile site fast? – You can test how fast your mobile website loads with Google’s Test My Site tool. Typically, a site should load in under 3 seconds to deliver a good experience. You can also see how well your website performs compared to your industry. The tool shows you all the possible ways to speed up your mobile site, along with a downloadable report.

Does your mobile site have any intrusive pop-ups? – If you are using pop-ups that are too intrusive, Google might penalize your website. Your content should be easily accessible to the user when she lands on your site from the mobile search results. If you are using such an overlay, you should remove it or change the layout to make it less intrusive.

Make Site Architecture SEO-Friendly

Does Your Home Page Have A Link From Every Other Page On Your Site? – Your homepage is the most important page on your website, and it should be linked from every other page of your site. Linking to the homepage from the navigation or logo is a usability best practice.

Do All Your Pages Have At Least One Internal Link Pointing To Them? – When certain pages on your site have no links pointing to them, they lose the chance of being discovered. In some cases these pages might get discovered via an external link or the sitemap, but they still might not be indexed or ranked if they don’t receive enough internal links. This is because Google passes link equity through links, and a page with no internal links will not have enough link equity, or PageRank, to rank in search engines.

Are all pages within 3 clicks of the HomePage? – The fewer clicks needed to reach a page, the flatter the site architecture. Sometimes the most important, high-converting product pages are buried deep, and this may result in the page not getting enough link juice to rank. One way to improve the ranking of such pages is to promote them higher in the hierarchy, so that Google passes them more link value.

Is it easy to navigate from one page to another? – No matter where your customer or a search engine crawler lands, they need guidance to navigate the website. Good site navigation helps users and crawlers discover and browse the entire website.

Is your site using Breadcrumbs? – Breadcrumbs are small navigational aids that help people visualize where they are on your site. Breadcrumbs help users navigate the site logically, and search engines use them to understand the informational architecture of your website. Breadcrumbs can also be displayed in your listing on the search engine results page.
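
To make breadcrumbs eligible for display in search results, they can be marked up with schema.org’s BreadcrumbList; the trail below is hypothetical:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Widgets",
      "item": "https://www.example.com/widgets/" },
    { "@type": "ListItem", "position": 3, "name": "Blue Widget" }
  ]
}
</script>
```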

Honourable Mentions For Technical SEO Checklist

Closing this checklist without mentioning other great technical checklists on the web would be an injustice. Here are a few great checklists you should follow.

It was hard work compiling this list; it took me weeks to write a tutorial for each of these checks. If you liked this article, please share it with your friends and link to it from your blog posts.

After working 7+ years as a digital marketer for startups and large enterprises, I quit my job to start EcommerceYogi. Here I share the exact tactics I have used to drive millions of users per month to e-commerce stores. Follow me on LinkedIn and Twitter to stay connected.