We’ve all been there, whether we’ve been an SEO for weeks or decades — staring at a website, knowing there are issues slowing it down and keeping it from the top of results pages. Just looking at the site, you can tick off a few changes that need to be made: Perhaps the title tag on the homepage doesn’t follow SEO best practices, or the navigation looks like something you need both hands to maneuver.
A technical SEO audit isn’t easy; it’s a puzzle with lots of evolving pieces. The first time you face an audit, it can seem like there’s just too much to do. That’s why we’ve put together this detailed guide.
Follow these steps to run a technical audit using Semrush tools. This guide is especially useful for beginners who want a step-by-step reference so they don’t miss anything major. We’ve broken the process into 15 steps so you can check them off as you go.
When you perform a technical SEO audit, you’ll work through these 15 areas:
- 1. How To Spot and Fix Indexation and Crawlability Issues
- 2. How To Address Common Site Architecture Issues
- 3. How To Audit Canonical Tags and Correct Issues
- 4. How To Fix Internal Linking Issues on Your Site
- 5. How To Check for and Fix Security Issues
- 6. How To Improve Site Speed
- 7. How To Discover the Most Common Mobile-Friendliness Issues
- 8. How To Spot and Fix the Most Common Code Issues
- 9. How To Spot and Fix Duplicate Content Issues
- 10. How To Find and Fix Redirect Errors
- 11. Log File Analysis
- 12. On-Page SEO
- 13. International SEO
- 14. Local SEO
- 15. Additional Tips
Semrush’s Site Audit Tool should be a major player in your audit. With this tool, you can scan a website and receive a thorough diagnosis of its health. There are other tools, including Google Search Console, that you’ll need to use as well.
Let’s get started.
1. How To Spot and Fix Indexation and Crawlability Issues
First, we want to make sure Google and other search engines can properly crawl and index your website. This can be done by checking:
- The Site Audit Tool
- The robots.txt file
- The sitemaps
- Subdomains
- Indexed versus submitted pages
Additionally, you’ll want to check canonical tags and the meta robots tag. You’ll find more about canonical tags in section three and meta robots in section eight.
Semrush’s Site Audit Tool
The Site Audit Tool scans your website and provides data about all the pages it’s able to crawl, including how many have issues, the number of redirects, the number of blocked pages, overall site performance, crawlability, and more. The report it generates will help you find a large number of technical SEO issues.
robots.txt
Check your robots.txt file in the root folder of the site: https://domain.com/robots.txt. You can use a validation tool online to find out if the robots.txt file is blocking pages that should be crawled. We’ll cover this file — and what to do with it — in the next section on site architecture issues.
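If you’d rather spot-check rules from a script than paste URLs into an online validator, Python’s standard library includes a robots.txt parser. Here’s a minimal sketch (the domain and page URLs are placeholders for your own):

```python
# Minimal sketch: check whether specific pages are blocked by robots.txt.
# Uses only the Python standard library; swap in your own domain and URLs.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://domain.com/robots.txt")
parser.read()  # fetches and parses the live file

# Pages you expect search engines to be able to crawl
pages_to_check = [
    "https://domain.com/",
    "https://domain.com/blog/",
    "https://domain.com/products/widget",
]

for url in pages_to_check:
    if not parser.can_fetch("Googlebot", url):
        print(f"Blocked for Googlebot: {url}")
```

Any page that prints here is disallowed for Googlebot by your current rules, which is exactly the kind of accidental blocking this step is meant to catch.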
Sitemap
Sitemaps come in two main flavors: HTML and XML.
- An HTML sitemap is written so people can understand a site’s architecture and easily find pages.
- An XML sitemap is specifically for search engines: It guides the spider so the search engine can crawl a website properly.
It’s important to ensure all indexable pages are submitted in the XML sitemap. If you’re experiencing crawling or indexing errors, inspect your XML sitemap to make sure it’s correct and valid.
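To sanity-check the format, here’s what a minimal valid XML sitemap looks like under the sitemaps.org protocol, with one url entry per indexable page (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://domain.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://domain.com/blog/</loc>
  </url>
</urlset>
```

The loc element is required for each URL; lastmod is optional but helps search engines prioritize recrawling.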
Like the robots.txt file, you’ll likely find an XML sitemap in the root folder:
https://domain.com/sitemap.xml
If it’s not there, a browser extension can help you find it, or you can use the following Google search commands:
- site:domain.com inurl:sitemap
- site:domain.com filetype:xml
- site:domain.com ext:xml
If there’s no XML sitemap, you need to create one. If the existing one has errors, you’ll need to address your site architecture. We’ll detail how to tackle sitemap issues in the next section.
To fix crawlability and indexing issues, find or create your sitemap and make sure it has been submitted to Google.
Submitting your sitemap means posting it on your website in an accessible location (not gated by a login or other page), then entering the sitemap’s URL in the Sitemaps report in Google Search Console and clicking “Submit.”
Check the Sitemaps report in Google Search Console to find out if a sitemap has been submitted, when it was last read, and the status of the submission or crawl.
Your goal is for the report to show a status of “Success.” Two other potential results, “Has errors” and “Couldn’t fetch,” indicate a problem.
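If you’d rather pull that status programmatically, the Google Search Console API exposes the same sitemap data. Below is a hedged sketch using google-api-python-client; it assumes you’ve already completed the OAuth flow and saved authorized credentials to token.json (a placeholder path), and the property URL is a placeholder too:

```python
# Sketch: list submitted sitemaps and their error/warning counts via the
# Google Search Console API. Assumes OAuth credentials saved in token.json
# (a placeholder path) with a Search Console scope already granted.
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials.from_authorized_user_file("token.json")
service = build("searchconsole", "v1", credentials=creds)

response = service.sitemaps().list(siteUrl="https://domain.com/").execute()
for sitemap in response.get("sitemap", []):
    print(sitemap["path"],
          "errors:", sitemap.get("errors", 0),
          "warnings:", sitemap.get("warnings", 0))
```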
Subdomains
In this step, you’re verifying your subdomains, which you can check by doing a Google search:
site:domain.com -www.
Note the subdomains and how many indexed pages each one has. You want to check whether any pages are exact duplicates of, or overly similar to, pages on your main domain. This step also lets you see if there are any subdomains that shouldn’t be indexed.
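One rough, low-tech way to screen suspect pairs for duplication is to compare the page sources directly. The sketch below uses only Python’s standard library; the URL pairs and the 90% threshold are assumptions, and a real audit would compare extracted text rather than raw HTML:

```python
# Rough duplicate screen: compare a main-domain page against its
# subdomain counterpart. URL pairs and the 0.9 threshold are placeholders.
from difflib import SequenceMatcher
from urllib.request import urlopen

pairs = [
    ("https://domain.com/pricing", "https://shop.domain.com/pricing"),
]

for main_url, sub_url in pairs:
    main_html = urlopen(main_url).read().decode("utf-8", errors="ignore")
    sub_html = urlopen(sub_url).read().decode("utf-8", errors="ignore")
    ratio = SequenceMatcher(None, main_html, sub_html).ratio()
    if ratio > 0.9:  # judgment call; tune for your templates
        print(f"Possible duplicate ({ratio:.0%} similar): {sub_url}")
```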
Indexed Versus Submitted Pages
In the Google search bar, enter:
site:domain.com
or
site:www.domain.com
In this step, you’re making sure the number of indexed pages is close to the number of submitted pages in the sitemap.
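To get the submitted side of that comparison, you can count the URLs in your XML sitemap. Here’s a short sketch using Python’s standard library (the sitemap URL is a placeholder, and it assumes a single sitemap file rather than a sitemap index):

```python
# Count the URLs submitted in an XML sitemap so you can compare the
# total against the rough count Google shows for a site: search.
import xml.etree.ElementTree as ET
from urllib.request import urlopen

SITEMAP_URL = "https://domain.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

tree = ET.parse(urlopen(SITEMAP_URL))
locs = tree.findall(".//sm:url/sm:loc", NS)
print(f"{len(locs)} URLs submitted in the sitemap")
```

A large gap in either direction between this count and the site: result is worth investigating.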
What To Do with What You Find
The issues and errors you find when you check for crawlability and indexability can be put into one of two categories, depending on your skill level:
- Issues you can fix on your own
- Issues a developer or system administrator will need to help you fix
A number of the issues you can fix, especially those related to site architecture, are explained below. Consider the following two guides for more in-depth information:
- For manageable fixes to some of the most common crawlability issues, read “How to Fix Crawlability Issues: 18 Ways to Improve SEO.”
- If you’d like more information on the specifics of crawling and indexability, check out “What are Crawlability and Indexability: How Do They Affect SEO?”
2. How To Address Common Site Architecture Issues
You ran the Site Audit report, and you have the robots.txt file and sitemaps. With these in hand, you can start fixing some of the biggest site architecture mistakes.
Site Structure
Site structure is how a website is organized: a good structure groups related content and makes pages easy to reach in as few clicks as possible. It’s logical and easily expanded as the website grows. Here are six signs of a well-planned and structured website:
- It takes only a few clicks (ideally three) for a user to find the page they want from the homepage.
- Navigation menus make sense and improve the user experience.
- Pages and content are grouped topically and in a logical way.
- URL structures are consistent.
- Each page shows breadcrumbs. You have a few types of breadcrumbs to choose from, but the point is to help website users see how they’ve navigated to the page they’re on (see the markup sketch after this list).
- Internal links help users make their way through the site in an organic way.
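Beyond the visible trail, breadcrumbs can also be marked up with schema.org structured data so search engines understand the hierarchy. Here’s a minimal BreadcrumbList in JSON-LD; the names and URLs are placeholders, and per Google’s documentation the final item can omit its URL:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://domain.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog",
      "item": "https://domain.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "Technical SEO Audit" }
  ]
}
</script>
```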
It’s harder to navigate a site with messy architecture. Conversely, when a website is structured well and uses the elements listed above, both your users and SEO efforts benefit.
Site Hierarchy
When it takes 15 clicks to reach a page from the homepage, your site’s hierarchy is too deep. Search engines consider pages deeper in the hierarchy to be less important or relevant.
Conduct an analysis to regroup pages based on keywords and try to flatten the hierarchy. Making these types of changes will likely change URLs and their structures, and may also mean updating the navigation menus to reflect new top-level categories.
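One way to quantify depth before you regroup pages: run a breadth-first search from the homepage over your internal link graph, which gives every page’s click depth. A toy sketch (the link graph is a placeholder; in practice you’d build it from a crawler export):

```python
# Compute click depth from the homepage with a breadth-first search.
# The toy link graph below is a placeholder for a real crawler export.
from collections import deque

links = {
    "/": ["/blog/", "/products/"],
    "/blog/": ["/blog/seo-audit/"],
    "/products/": ["/products/widget/"],
    "/blog/seo-audit/": [],
    "/products/widget/": [],
}

depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for target in links.get(page, []):
        if target not in depth:  # first visit is the shortest path
            depth[target] = depth[page] + 1
            queue.append(target)

for page, d in sorted(depth.items(), key=lambda item: item[1]):
    print(d, page)
```

Pages that come out deeper than three or four clicks are the first candidates for regrouping.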
Source: How To Perform a Technical SEO Audit in 15 Steps