If you’re running a website, it’s important to make sure that you’re doing everything you can to optimize it for search engines. Technical SEO is one of the most important aspects of this process, and it can be tricky to get right.
In this follow-up to our Learn SEO: The Complete Guide for Beginners, we will walk you through everything you need to know about technical SEO. We’ll cover topics like crawlability, indexing, and page speed optimization. We’ll also provide tips on how to troubleshoot common issues and fix them.
What is technical SEO?
Technical SEO is the process of optimizing your website for the technical aspects that search engines use to crawl and index it, with the end goal of increasing rankings.
This includes things like your site’s structure, code, and page speed. By improving these technical factors, you can make your site more visible and easier to find in search results.
Why is technical SEO important?
You can have awesome content, but if your website has technical errors, the content you worked so hard on will be difficult to find. This is why technical SEO is important.
For one, it can help improve your site’s visibility in search results. If your site is well-optimized, it will be easier for search engines to find and index it. This can lead to higher rankings and more traffic over time.
Additionally, technical SEO can help improve your site’s user experience. Faster loading times and an easy-to-use interface can make a big difference for your visitors.
Is technical SEO difficult?
Technical SEO can seem daunting at first, but it’s actually not that difficult once you get the hang of it.
The most important thing to remember is that technical SEO is all about making your website as easy to find and index as possible.
That means ensuring that your pages are well-structured and free of errors, and providing clear and concise metadata.
It also means taking steps to improve your site’s speed and performance.
While this may sound like a lot of work, the payoff is worth it—technical SEO can help you attract more organic traffic and improve your search engine rankings.
Is it possible to rank a website properly without technical SEO?
In order for a website to rank on the search engines, technical SEO must be implemented.
This means that:
- The website must be designed in such a way that it can be easily found and indexed by the search engine crawlers.
- The website must also have fast loading times and be free of errors.
- The website’s structure and hierarchy should be easy to understand.
So without optimizing the technical factors in your website, it won’t rank in the search engines.
Technical SEO vs on-page SEO
One common misconception is that technical SEO and on-page SEO are the same thing. But there is a big difference between the two.
On-page SEO refers to the optimization of individual pages on your website. This includes things like title tags, meta descriptions, and keyword research.
Technical SEO, on the other hand, refers to the technical aspects of your site that affect its crawlability and indexation. This includes things like site structure, code, and page speed.
So if you’re looking to improve your website’s visibility in search results, be sure to focus on both technical and on-page SEO, and not just one or the other.
Glossary of terms
Before we move forward, here are some terms for quick reference.
Crawling
Search engines detect new and updated content by sending search spiders, which crawl and store information from websites for indexing and retrieval.
Indexing
Search engines index content they find during crawling and store it. As soon as a page is indexed, it is eligible to appear when relevant to a user’s search query.
Retrieval
Search engines search their index database for results relevant to the search query, then retrieve those results and rank them according to what looks to be the most useful to the user.
XML sitemap
An XML sitemap is used to help search engines understand the structure of a website. It’s basically a map that shows search engines what pages are on a site and how they’re related.
This can be especially helpful if a website has a lot of content or if it’s regularly updated.
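For reference, a bare-bones XML sitemap looks something like this (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/sample-page/</loc>
    <lastmod>2022-01-01</lastmod>
  </url>
</urlset>

Each <url> entry represents one page; generator tools and plugins build this file for you automatically.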
Robots.txt
The robots.txt is a file used to control which files search engine spiders can access on your website. The file is located in the root directory of your website.
Structured data
Structured data is a specific format for organizing information on a website. This format can be used by search engines to understand the contents of a page, and provide more relevant results to users.
Structured data also makes the SERPs more informative by triggering knowledge panels, featured snippets, and event snippets.
Technical SEO audit tools
Here is a list of the tools you can use for your audit.
Screaming Frog SEO Spider
We covered Screaming Frog extensively in our Learn SEO guide. Basically, it crawls your links and gives you an overview of what’s going on in your website: Your 404s, duplicates, missing metadata, and more.
Download: Learn SEO: The Complete Guide for Beginners
SE Ranking
A paid tool I’ve found to be useful as well for auditing is SE Ranking. The best part for me about this tool is that the report interface is so clean and intuitive. You can check out our review for SE Ranking’s website audit feature here.
Semrush
Semrush has a site audit functionality as well which you can use to check orphan pages, core web vitals, and more.
Further reading: Semrush Core Web Vitals Report Review
Google Search Console
Google Search Console is an indispensable tool for technical SEO. This is where you check and verify URLs, submit your sitemap, and more.
PageSpeed Insights
It’s important that you know how fast your website is and what’s causing it to slow down. PageSpeed Insights is the tool you need for that.
No matter how good your content is, if people just leave your website because of technical issues like slowness, your traffic and engagement will remain low.
How to perform a technical SEO audit (with checklist)
One of the most important aspects of technical SEO is auditing your website. This helps you identify any potential issues that could be holding your site back from ranking higher in search results.
How often should I do a technical SEO audit?
There’s no one-size-fits-all answer to this question. The frequency of your technical SEO audits will depend on the size and complexity of your website, as well as how quickly things change on your site.
That being said, we recommend doing even just a short technical SEO audit once a month, and a more in-depth audit every quarter. This will give you a chance to identify and fix any technical issues before they have a chance to hurt your site’s performance.
Let’s begin.
Review your site structure
Tools to use: Screaming Frog, Semrush
One of the most important things you can do to improve your website’s SEO is to review its structure. This will help you identify any potential issues that could be hindering your site’s performance in search engines.
Website structure includes things like your site’s hierarchy, URL structure, and internal linking.
For businesses with new sites, planning your website’s structure is the first and most important thing to do.
You have to make sure that both visitors and search engines can easily navigate through your website.
A poorly designed site structure can confuse both developers and visitors, and it often leaves behind orphan pages. When this happens, you will have to spend extra time and effort finding those orphan pages and linking to them.
To plan your website structure, you can create a simple chart or outline like this example below:
Keep your site structure as simple as you can, so visitors and search engines can easily navigate between your web pages.
There are articles arguing that the most important information on a website should only be a couple of clicks deep (see: the 3-click rule), while the Nielsen Norman Group has argued that it's an arbitrary rule not backed by data.
But in 2018, Google's John Mueller said that click depth does matter, so a good rule of thumb would be to ensure that the most important information is accessible from the homepage.
Look for orphan pages
As mentioned earlier, you will have to look for orphan pages as part of your audit.
Orphan pages are pages on a website that are not internally linked, meaning they have zero links pointing to them from other pages on the site. This makes it difficult for search engine bots to crawl and index them.
Orphan pages occur for different reasons. They could be old blog posts, products that are no longer sold, or services pages that are no longer offered.
While some pages are purposely left unlinked, such as testing pages and tag pages, it is critical that you check for orphan pages that are still relevant to users.
Does it affect my SEO?
The answer is both yes and no; it depends on the page. If an orphan page was created to be shown to users and has content that matters to them, it hurts your SEO: crawlers can't discover the page, so it won't appear in the search results, and users won't be able to find it either.
However, if an orphan page was created for purposes unrelated to users, such as testing functionality or trialing a new website design, then you can leave it as it is.
How to find orphan pages using Screaming Frog
To find orphan pages using Screaming Frog, you have to first make sure that your Google Analytics and Google Search Console accounts are connected.
To do that, under Configuration, scroll down to API access and connect Google Analytics and Google Search Console.
Once they're connected, make sure that under the General tab of the API window, you select Crawl New URLs Discovered in Google Analytics.
After connecting your GA and GSC accounts, under Configuration, go to Spider, and check Crawl Linked XML Sitemaps. Then check the option Crawl These Sitemaps: and input the URL of your website’s sitemap.
After setting everything up, you can start crawling your website. Once the crawl is finished, go to Crawl Analysis, click Configure, and check the box beside Sitemaps. Screaming Frog will then analyze the crawl data and surface the orphan pages.
After the analysis, in the Overview under Sitemaps, you can now see all orphan pages that were crawled by Screaming Frog.
How to find orphan pages using Semrush
You can also find orphan pages by setting up Site Audit in Semrush. If you don't have a project set up yet, create one first and let Semrush crawl your website.
Once the project setup is complete, open your website's Site Audit and go to Issues. Under the Notices tab, scroll down to check whether the orphaned pages report is enabled.
If it hasn't been enabled yet, connect your Google Analytics account in the Site Audit Settings. The process is similar to Screaming Frog: it will prompt you to log in with your Google account, select the Profile, Property, and View of your website, and click Save.
Once you complete the setup, Semrush will automatically collect data from Google Analytics. Unlike Screaming Frog, you don’t have to connect Google Search Console to get orphan pages data in Semrush.
After a few minutes, refresh your browser and check the Issues tab again. Click the drop-down menu Select an Issue and you will find Orphaned Pages (Google Analytics) under Notices.
Optimize or scrap?
Once you've collected all the orphan pages, it's up to you to decide what to do with them. You can list them in a Google Sheet.
- If a page is still relevant, label it as 'optimize' and find pages that could link to it.
- If a page used to be relevant but no longer is, such as an old product or service page, you can delete it and leave it as a 404. There's no need to redirect it, since an orphan page carries little to no link value.
- If a page was purposely left unlinked, you can leave it as it is.
Here’s a sample template that you could use:
While orphan pages can be harmless to your website's overall rankings and SEO value, they become a critical issue when important pages are left out. Include orphan page monitoring in your regular website maintenance audit, and make sure your website has a healthy site structure and a good flow of link juice by internally linking pages to each other.
Fix 404 pages
A 404 error is an HTTP status code sent by a web server to the browser when it can't find the webpage a user wants to access. It usually displays a "Page Not Found" message to users.
To find them, open your Screaming Frog and input your website URL, then start the crawl.
Then click Client Error (4xx) to see the status codes.
You can compile these and send them over to your team's web developers, or 301-redirect the URLs you believe are still useful to the new pages that replaced them.
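If you manage redirects at the server level rather than through a plugin, a 301 on Apache is a one-liner in your .htaccess file. A sketch, with placeholder paths:

# Permanently redirect an old URL to its replacement
Redirect 301 /old-page/ https://www.example.com/new-page/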
Secure your website
Switch to HTTPS by installing a Secure Sockets Layer (SSL) certificate. SSL encrypts the link between a web server and a browser.
If you're building a website from scratch, many hosting providers will include an SSL certificate for free with your plan.
If not, you can ask the web developers assigned to your website to purchase and activate the SSL certificate. You can also learn how to do it yourself through this in-depth guide by HubSpot.
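Once the certificate is active, you'll also want to force visitors onto HTTPS. On Apache, a common sketch (assuming mod_rewrite is enabled) looks like this in .htaccess:

# Redirect all HTTP traffic to HTTPS
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]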
Generate and submit an XML sitemap
Tools to use: Google Search Console
You can check if your website has a sitemap by entering this in your browser's address bar: www.[website]/sitemap.xml.
If your website doesn’t have one yet, you can use a plugin such as the Google Sitemap Generator Plugin by Arne Brachhold. You can also use https://www.xml-sitemaps.com/ to generate your sitemap.
After you generate your sitemap, you need to submit it. To do that, you need to open Google Search Console, then click Sitemaps. You can then add your sitemap URL and submit.
Further reading: Ultimate Sitemap SEO Guide
Check your robots.txt
To verify your website's robots.txt, type this in your browser's address bar: www.[website]/robots.txt.
If you don’t have one yet, open your Notepad and copy-paste the following:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://www.[website]/sitemap.xml
This is the default directive found in a basic robots.txt file.
You can add other rules based on your needs.
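For example, to keep crawlers out of your internal search results, you can add another Disallow line to the existing User-agent: * group, and you can add a separate group to block a specific bot (the path and bot name here are just placeholders):

User-agent: *
Disallow: /wp-admin/
Disallow: /search/
Allow: /wp-admin/admin-ajax.php

User-agent: BadBot
Disallow: /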
Once you’re done, save it as a .txt file and upload it to the root directory of your website. You can ask the web developers to do so or you can contact your web hosting service provider to help you out.
Further reading: The Complete Guide to Robots.txt and Noindex Meta Tag
Check your robots meta tags
Tools to use: Screaming Frog
Robots meta tags are snippets of code used to instruct search engines what to follow, what not to follow, what to index, and what not to index. They are found in the <head> section of your webpage.
They usually look like this:
<meta name="robots" content="noindex">
You can press CTRL+U on an individual page to view the page source and check for the meta tag, or you can use Screaming Frog to check the directives across your website.
For example, /category/marketing/ has a directive of noindex, follow. Let’s look at it through the page source:
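In the page's <head>, that directive would appear as:

<meta name="robots" content="noindex, follow">

This tells search engines not to index the page, but to still follow the links on it.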
There you go.
Further reading: What Meta Robots Tag Are For
Optimize URL slugs
Next, optimize your URL slugs.
When you create a slug, make sure that it contains your keyword, and that it’s as short yet descriptive as possible.
So instead of using /seo-in-the-philippines/ as our slug, we just use /seo-philippines/.
It’s much easier to understand and remember. It’s also a good web accessibility practice.
If the webpage already exists, you can still change the URL—but make sure to redirect the old URL to the new one. In our case, we use the 301 Redirects plugin, so our redirect settings look like this:
Provide clear and concise metadata
Meta tags are snippets of information that describe a page's content; they appear in the source code, not on the page itself. Meta tags are essentially little content descriptors that help tell search engines what a web page is about.
According to WordStream, "The 'meta' stands for 'metadata,' which is the kind of data these tags provide—data about the data on your page."
To optimize your meta tags, you can opt to change your title tags and meta descriptions—and of course, include your keyword.
A rule of thumb for title tags is to keep them between 50-60 characters.
For meta descriptions, keep them under 155 characters.
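Put together, a page's title tag and meta description live in the <head> and look like this (the text here is a made-up example):

<head>
  <!-- Title tag: roughly 50-60 characters, keyword near the front -->
  <title>Technical SEO: A Beginner's Audit Checklist</title>
  <!-- Meta description: under 155 characters -->
  <meta name="description" content="Learn how to audit your site's crawlability, indexing, and speed with this beginner-friendly technical SEO checklist.">
</head>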
When you have a compelling and informative title tag and description, the likelihood that users will click on your page will increase.
And when your page consistently earns those clicks, Google sees it as a signal that your site is relevant, which can support higher rankings over time.
Further reading: INFOGRAPH: 5 Steps to Create the Perfect Meta Description
Optimize site speed and performance
Tools to use: PageSpeed Insights (you can also choose Google Lighthouse (Chrome Extension) or Test My Site)
Another important aspect of technical SEO is page speed optimization. Site speed is a ranking factor for both desktop and mobile searches, so it’s important to make sure your pages load quickly.
Do take note that some sources use the terms site speed and page speed interchangeably, while others draw a distinction between the two.
Either way, it's important that the content on your webpage loads fast. This keeps your site visitors from leaving your website and going to your competitors.
Optimizing images
Large images will greatly affect your site speed, so you need to reduce their file size.
The trouble with reducing image size is that it can also reduce quality. The good news is that there are plugins and scripts that let you shrink a file with minimal quality loss.
To reduce image size without compromising quality, choose the right combination of file format and compression type. Ideally, your image's file format should be PNG at a medium compression rate.
You can use the following tools when optimizing images for your webpage:
- Adobe Photoshop
- Gimp
- FileOptimizer
- ImageResizer.com
ImageResizer.com dashboard
Enabling browser caching
The concept of browser caching is straightforward: it takes whatever files you define as files that don’t change often (such as your company logo and website menu) and downloads them once to the visitor’s browser.
This way, they don’t have to be redownloaded every single time they visit your website, making your webpages load much faster.
There are three ways to go about this:
Ask your web hosting provider
You can contact your web hosting provider and have them edit your site’s .htaccess file. That way, you don’t have to touch anything in your website.
Edit the .htaccess file yourself
The heading says it all. I wouldn't recommend this unless you know how to troubleshoot in case you make a mistake. If you're using Yoast, here's their guide.
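For reference, browser caching on Apache is typically configured with mod_expires. A minimal sketch, assuming the module is enabled (the lifetimes are just examples):

# Cache static assets that rarely change
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>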
Use a plugin
There are various plugins you can choose from that you just download and activate. Quick and easy as pie. Just check with your web developers to confirm that they won't cause any issues with other plugins.
Enable Gzip Compression
When a browser loads your website, it has to download all the relevant files stored on your server. If files like HTML, PHP, CSS, and JavaScript are too large, chances are they will drag down your site speed as well.
This is where Gzip comes into play. It compresses your files down to as little as 30% of their original size so your webpage loads faster.
You can use the Gzip Compression Test tool to check if Gzip is enabled on your website, or if you simply want to check your website’s Gzip compression rate.
A word of caution: Gzip is meant for text-based files, not images. Image formats like JPEG and PNG are already compressed, so gzipping them adds server overhead with little to no reduction in size.
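If you'd rather enable it yourself on Apache, a minimal sketch using mod_deflate (assuming the module is available on your server) looks like this:

<IfModule mod_deflate.c>
  # Compress text-based files only; images are already compressed
  AddOutputFilterByType DEFLATE text/html text/css text/plain
  AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>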
There are many other aspects you need to check when it comes to optimizing your site speed. I’ve only included three in this post. For our in-depth guide to site-speed optimization, check out:
Further reading: Ultimate Guide to Site-Speed Optimization
Also, keep in mind that consistency is key when it comes to site speed optimization. Keeping a site fast sometimes means adding new scripts or plugins, so it's best to run a site audit every month to catch issues before they cause downtime.
Fix content issues
If you’re having trouble with technical SEO, there are a few common issues that could be to blame.
One of the most common is duplicate content. This can happen if you have multiple pages with similar or identical content.
To fix this, you'll need to either apply a 301 redirect or add rel="canonical" tags to your pages.
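A canonical tag goes in the <head> of the duplicate page and points to the version you want indexed. A sketch, with a placeholder URL:

<link rel="canonical" href="https://www.example.com/preferred-page/">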
Another common issue is thin content. This happens when a page has very little useful content.
Different websites have their own opinions on how long content should be—Yoast recommends >300 words for regular posts or pages, while HubSpot claims that your content length shouldn’t go lower than 2,100 words.
Which one should you follow? I would argue that it’s not exactly a matter of word count, but of content quality.
- Does your blog post go in-depth and answer what your target audience needs to know about the topic?
- Does it match search intent?
I suggest focusing on those instead of word vomiting irrelevant content just to “fix” thin content.
Further reading: How TO NOT Screw up Your Canonical Tags and Search Intent SEO for Beginners
Implement structured data
According to Yoast,
“Where Schema is the language in which you present your content, structured data is the actual data you provide. It describes the content on your page and what actions site visitors can perform with this content. This is the input you give search engines to get a better understanding of your pages.”
Simply put, it helps make your page more understandable for search engines and users.
To implement structured data, there are a number of things you can do, but I would suggest using a plugin like Yoast or WPSSO Core.
It gives you the option to select entity types and automatically applies the schema for you, instead of making you manually choose a template and input the data.
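If you'd rather add structured data by hand, it's usually written as a JSON-LD script in the page's <head>. Here's a minimal sketch for a hypothetical article (all values are placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO for Beginners",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2022-01-01"
}
</script>

You can validate markup like this with Google's Rich Results Test before publishing.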
Further reading: How to Create Structured Data Markup for Rich Snippets
Prioritize mobile-friendliness
Source: BroadbandSearch
The mobile-friendliness of a website is a measure of how well it is designed and optimized for mobile devices such as smartphones and tablets.
Just looking at the graph above, we can confidently say it’s integral to optimize for mobile.
Install an AMP (Accelerated Mobile Pages) plugin to your website
Instead of manually optimizing your site for mobile, we recommend installing the AMP for WordPress plugin.
This is what we use here in SEO Hacker, and it’s made our job a lot easier when it comes to optimizing for mobile.
- Install the AMP WordPress plugin.
- Activate the plugin. It will append /amp to all your pages, but it won't redirect visitors to them.
- Edit your .htaccess file. You can use an FTP program to do this; I personally use FileZilla.
- (Optional) Just in case you want to check if your AMP pages are working across the board—in your .htaccess file, paste this code:
RewriteEngine On
# Skip URLs that already end in /amp
RewriteCond %{REQUEST_URI} !/amp$ [NC]
# Match known mobile user agents
RewriteCond %{HTTP_USER_AGENT} (android|blackberry|googlebot-mobile|iemobile|iphone|ipod|opera\smobile|palmos|webos) [NC]
# Temporarily (302) redirect matching pages to their /amp version
RewriteRule ^([a-zA-Z0-9-]+)([/]*)$ https://website.com/$1/amp [L,R=302]
Note that you have to change website.com to your site's domain name. I explicitly made the redirect a 302 because we don't want all the link equity to be passed on to your /amp pages, since they are merely accelerated mobile versions of the originals.
- Edit the CSS to make your Accelerated Mobile Pages look and feel more like your site. You can edit the CSS using FTP by going to your wp-content > plugins > AMP > template.php
- Add rel="canonical" tags on your AMP pages pointing to your original pages (see the snippet after this list), just to keep anything Panda-related off your back.
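For reference, the canonical pairing works in both directions; the original page declares its AMP version, and the AMP page points back to the original (URLs are placeholders):

<!-- On the original page -->
<link rel="amphtml" href="https://www.example.com/sample-post/amp/">
<!-- On the AMP page -->
<link rel="canonical" href="https://www.example.com/sample-post/">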
That’s it!
You can see that SEO Hacker's mobile version still looks and feels like our desktop page design, without all the fluff.
How can I make AMP work for my non-WordPress site?
You will have to go to the AMP Project's site and learn how to integrate it by hand-coding it yourself.
If you want to know more about AMP, Moz's Whiteboard Friday does a swell job of explaining it further:
Verifying your AMP pages on Google Search Console
Once you've set up AMP on your website, Google will start crawling and indexing your AMP pages. Within a few days, you should see an AMP section under Enhancements in your Google Search Console.
Google Search Console will notify you if there are any errors in your AMP pages. Like other coverage reports, it's divided into three categories: Errors, Valid with Warnings, and Valid. When you first apply AMP to your website, there's a high chance that most of your pages will fall under Valid with Warnings.
Don't worry, your AMP pages will still be available to users. It just means some data that could further enhance your AMP pages, such as structured data or image sizes, is missing.
Structured Data on AMP pages
Aside from loading speeds and user experience on mobile, adding structured data on your AMP pages also makes them eligible for Rich Results.
If you’re using WordPress, there are plugins that enable structured data on AMP pages. Google highly recommends having structured data on both the original page and the AMP version.
Further reading: The SEO Hacker Mobile Optimization Checklist
How do I know if my technical SEO is working?
There are a number of metrics you can measure to gauge the effectiveness of your technical SEO.
Organic traffic
This measures the number of visitors coming to your site from search engines. If you’re seeing an increase in organic traffic, it’s likely that your technical SEO is working.
Click-through rate (CTR) from SERPs
This measures the percentage of people who click on your listing when it appears in search results. A high CTR indicates that your listing is relevant and appealing to users, and that your technical SEO is working.
You can also check if you’re showing up on the Knowledge Panels and other snippets.
Loading times
If you’ve optimized for page speed correctly, you will notice that your site is a lot faster and smoother than before.
Troubleshooting common technical SEO issues
I asked some folks over at Reddit what they wanted to know about technical SEO, and some of them gave me questions that I wanted to include here.
How to fix unused CSS/Javascripts in WordPress
Unused code can make your website load slower, but thankfully there is a way to at least reduce this issue.
Use WP Rocket
WP Rocket is a plugin you can use to remove unused CSS and delay JavaScript execution.
You can check out and follow this really good tutorial from WPBeginner.
How to fix XML sitemap errors
An XML sitemap is a file that lists all the important pages of your website. It helps ensure that Google is able to crawl and index your web pages.
XML sitemaps also help search engines understand your site structure.
Essentially, a good XML sitemap will greatly benefit your website. However, if it's not set up correctly, you risk having your pages go unrecognized by Google.
Here are some pitfalls that you need to avoid with XML sitemaps:
Submitted URL has crawl issues
This is one of the most common XML sitemap issues you will come across in Google Search Console.
This means that your sitemap has listed a page with a known crawl error, but Google will not tell you exactly what kind of error it was.
You will need to reanalyze your sitemap for any undetected errors. The most common crawl issues are:
- Robots.txt blocking crawlers
- Error responses other than 404, such as 403 (Forbidden) and 401 (Unauthorized)
- JavaScript or CSS blocked by search engines
You can address these crawl issues with the 11 steps I outlined earlier.
Then go to Google Search Console and resubmit your sitemap.
Sitemap size error
As we discussed earlier in this post, size matters in SEO. Your sitemap must NOT:
- Be larger than 50MB
- Contain more than 1,000 images per URL
- Contain more than 50,000 URLs
If you have a simple site, your sitemap’s size shouldn’t be an issue.
However, if you have (for example) an eCommerce website that’s growing fast, it’s best that you create separate sitemaps for every 10,000 URLs you have.
Fewer URLs mean fewer crawl issues for you.
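When you split a large sitemap, you reference the individual files from a sitemap index, which you then submit to Google Search Console. A sketch with placeholder URLs:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-products-2.xml</loc>
  </sitemap>
</sitemapindex>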
I suggest that you take a look at Common XML sitemap errors by Yoast and Polishing a sitemap: fixing errors, weeding out trash pages, and finding hidden gems by SE Ranking for a more in-depth treatment of this issue.
How to handle multiple connected websites
The Redditor who asked this question works as a software developer who put their apps under different <appname>.app websites.
It’s generally not recommended to have multiple domains or websites for the following reasons:
- They can compete for the same keywords.
- They can be expensive.
- It can take a while for you to rank the websites because you’ll be doing a full SEO strategy for each of them.
I recommend first figuring out the purpose of your website. For example, is it to showcase your work as a software developer?
In that case, you can put your apps under one domain, then create subdirectories in your website for the various apps you want to feature.
This works especially well if your apps are connected to one another, or if they generally fall under a specific theme (e.g., productivity tools).
For example, under HubSpot you have various software products you can check out, and they're all under HubSpot.com.
If they are completely different and you want to market them as such (and you have the time and energy to SEO each of them), then you can go ahead and put them in different websites.
Key takeaway
Technical SEO is one of the most important aspects of ranking your website properly. It covers all the behind-the-scenes elements that are necessary for Google and other search engines to understand your site correctly.
Without proper technical SEO, it’s difficult to rank at all, let alone achieve high SERP positions. That’s why we created this guide—to help you audit and troubleshoot any issues on your own website and improve your rankings with these 11 steps.
If you want us to take care of your technical SEO for you, check out our SEO Services Package or contact us today for a free consultation.