If you’re reading this, it means you’ve already got some knowledge of SEO and have taken the step of performing an SEO audit on your website. Now, you’re probably eager to fix those errors and get your site performing even better. Great news: you’re in exactly the right place!
Just as a quick reminder, SEO (Search Engine Optimization) is all about improving your website’s visibility on search engines, making it easier for people to find you. While SEO can sometimes seem like a mysterious or complex process, it doesn’t have to be! With the right guidance, it becomes much more straightforward.
In this article, we’ll walk you through how to fix your website’s SEO issues. We’ll cover everything from common audit mistakes to backlink errors and on-page SEO fixes. So, grab a coffee, settle in, and let’s dive into the world of SEO improvement together.
SEO Audit Mistakes
Think of an SEO audit like a full check-up for your website. By using tools like Semrush, Ahrefs, Google Search Console, SEOQuake, and Ubersuggest, you can uncover the issues that are holding your site back from performing at its best.
These tools not only identify errors but also show you what your competitors are doing to rank higher. Once these errors are spotted, the next step is to correct them. Let’s take a closer look at what to check during an SEO audit and how to fix the issues you find.
Finding and Fixing Broken Links with Google Search Console
Broken links (also known as dead links) are links that point to pages that no longer exist, leaving users on frustrating “Page not found” (404) errors. They not only harm your user experience but also hurt your SEO rankings.
Thankfully, Google Search Console, a powerful and free tool, can help you detect and fix these broken links. Once your site is verified, open the Page indexing report and look for URLs flagged as “Not found (404)”. From there, you can either remove the internal links pointing to them or redirect the dead URLs to a working page, ensuring your visitors never hit a dead end again.
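If the content has simply moved, a 301 redirect sends visitors (and link equity) to the new address. As a minimal sketch, on an Apache server you could add a rule like this to your .htaccess file (the paths are purely illustrative):

# Permanently redirect the deleted page to its closest replacement
Redirect 301 /old-blog-post/ https://www.example.com/new-blog-post/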
Checking and Fixing Redirect Errors with Google Search Console (GSC)
To check for redirect errors in Google Search Console (GSC), go to the GSC Indexing report. When you see “Page with redirect,” it means some of your pages are not being indexed by Google because they redirect to another URL. This might be intentional, for example when your web developer sets up redirects during a migration so that visitors and link equity carry over to the new site.
However, if the redirect seems off or the page has important content, there might be an error that needs fixing.
To fix an accidental redirect in GSC, follow these steps: open the “Page with redirect” list, remove the redirect from the affected URL on your site, inspect the URL with the URL Inspection tool, and then request indexing. After that, allow anywhere from a few hours to a few days for Google’s crawlers to revisit and index the URL.
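Once the redirect has been removed, you can confirm the page answers directly before requesting indexing. A quick command-line check (the URL is a placeholder):

curl -I https://www.example.com/page-that-was-redirecting/

The response should start with a 200 status rather than a 301 or 302.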
Checking for Toxic Inbound Links
Since Google Penguin was introduced, keeping an eye on inbound links has been crucial to avoid penalties. A toxic or low-quality link can hurt your site’s search engine rankings. These links often come from outdated SEO tactics used by inexperienced agencies or from spammy websites, and sometimes, even from negative SEO campaigns by competitors trying to harm your Google ranking.
This is why regularly reviewing your backlinks is essential. You can also hire SEO specialists, like our team at Oshara, to handle it. To spot toxic links, use the Semrush backlink tool. After you’ve found potentially harmful links, cross-check them in Google Search Console to confirm they really point to your site. If confirmed, you have two options:
- Contact the Webmaster: If possible, reach out to the webmaster and ask them to remove the link. If finding contact info is hard, Semrush’s Backlink Audit tool can help.
- Disavow the Link: If you can’t get in touch with the webmaster, you can use Google’s disavow links tool, accessible through Search Console. But use this feature carefully, and only after trying the first option, since it’s an advanced tool from Google.
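If you do go the disavow route, Google expects a plain text file with one domain or URL per line, uploaded through its disavow links tool. A minimal sketch, with purely illustrative domains:

# Spammy domains identified in the backlink audit
domain:spammy-directory-example.com
domain:link-farm-example.net
# A single toxic page rather than a whole domain
https://another-example.org/paid-links-page.html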
Checking and Fixing HTTPS Errors
To check the security of your HTTPS connection, you can use tools like SSL Labs, SSL Checker, Wormly, or SSL Decoder. You can also do a manual test by clicking the padlock icon in your browser’s address bar. Having a secure connection is crucial because it shows users that their data is protected, building trust and improving the user experience.
There are different types of SSL errors, like server issues or problems with local settings. If your connection isn’t secure, users are at risk while browsing your site, which can hurt your SEO. Before you try to fix SSL errors, make sure to:
- Have the Latest Version of Chrome: Updates can solve some security issues.
- Check Time and Date: Make sure the time and date on your device are correct, as wrong settings can cause SSL problems.
- Clear Cache and Cookies: If you’re still having issues, clear your browser’s cache and cookies.
If none of these steps work, reach out to your hosting provider, as the problem could be on the server side. Fixing these errors quickly will improve the trustworthiness of your site for both users and search engines.
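For a deeper manual check, you can inspect the certificate from the command line, assuming OpenSSL is installed (replace the domain with your own):

echo | openssl s_client -connect oshara.ca:443 -servername oshara.ca 2>/dev/null | openssl x509 -noout -issuer -dates

This prints the certificate’s issuer and its validity dates, which quickly reveals an expired or misconfigured certificate.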
Check and Correct the Robots.txt File
The robots.txt file is important because it tells search engines which pages on your site they should not crawl. It helps guide Google and Bing crawlers, showing them which parts of your site are restricted and which are open for crawling.
To check your robots.txt file, use Google Search Console. The robots.txt report will show you any errors, giving you details like the last crawl date, file path, crawl status, and file size. Once you find errors, edit the file rules or fix any issues.
After making changes, request a new crawl from Google. Just click the settings icon (three vertical dots) next to your robots.txt file in the report and choose Request a recrawl. This ensures that search engines correctly read your robots.txt file, improving how your site gets crawled.
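For reference, a typical robots.txt for a WordPress site looks something like this (the paths and sitemap URL are illustrative):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap.xml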
Checking Index Coverage Issues
Google Search Console is an essential tool for SEO, especially with its index coverage tab. This tab gives you important info about which pages Google’s bots have crawled, which haven’t been indexed, and any problems they’ve found. The Coverage tab shows the status of your URLs, including errors, valid pages with warnings, valid pages, and excluded pages.
Focus on the “errors” section because these are the pages Google couldn’t index, meaning they won’t appear in search results. Carefully review the URLs with errors and fix the issues.
There are various reasons for these errors. Let’s go over the common causes and solutions:
- Redirect Issue: Google may run into problems with redirects, like incorrect URLs, redirect loops, or overly long redirects. To fix this, check the URLs and set up the redirects correctly based on best practices.
- Server Issue: If your server keeps returning a 500 error when Google tries to crawl it, even after some time, don’t ignore it. Contact your developer or hosting provider to fix the server issue.
- URL Designated as “Noindex”: A noindex tag means the URL is blocked from being indexed. If you want the page to be indexed, remove the tag. Otherwise, take the URL out of your sitemap.
Other common causes include 404, 401, and 403 errors, which are usually simple to resolve. If you encounter a problem that’s tough to fix, feel free to reach out to us.
On-Site Optimization for Better SEO
On-page SEO refers to the practices you use on your website to boost its ranking in search engines. It’s different from off-page SEO, which involves external factors like backlinks. On-page SEO focuses on how you produce and structure content so that it’s easier for Google to crawl and index.
Before diving into on-page SEO, it’s important to check certain basics and fix them if necessary. Here are some key points:
Check Your Website’s Adaptability to Smartphones
Making sure your website works well on smartphones is crucial for providing a good experience for mobile users. Here’s how to check if your site is mobile-friendly:
- Mobile Viewing: Test your site on a smartphone. Does everything display properly without needing to zoom in or scroll sideways? If yes, your site is mobile-friendly.
- Responsiveness Test: Resize your browser window on a computer to see if the content adjusts to different screen sizes.
- Responsive Web Design: Ensure your site uses responsive design, meaning it automatically adapts to different screen sizes for a consistent experience across all devices.
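At a minimum, a responsive site declares a viewport in the <head> of every page; without it, phones render the desktop layout and force users to zoom:

<meta name="viewport" content="width=device-width, initial-scale=1">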
A mobile-friendly site also helps with on-page SEO because mobile users will spend more time on your site if they have a good experience, sending positive signals to Google.
If your site isn’t working well on smartphones, it’s best to contact a web designer to fix the issues. Making your site accessible and easy to use on all devices helps improve its ranking in search engines.
Consolidate Duplicate URLs with Canonical Tag
Sometimes a website has duplicate pages or several URLs leading to the same content, which leaves Google unsure which version to show to users. This can result in Google indexing the wrong page.
The solution is to use canonical tags. A canonical tag tells Google which version should be indexed. You add it in the <head> of the duplicate pages (and, ideally, as a self-reference on the main page), pointing to the preferred URL, like this:
<link rel="canonical" href="https://oshara.ca/services/referencement-naturel-seo/" />
Beyond the tag itself, you have two other ways to reinforce the canonical URL:
- Redirect: Redirect the duplicate URLs to the canonical page. For example, if there are two identical pages, redirect page 2 to page 1.
- Sitemap Inclusion: Add the canonical URL to your sitemap. This helps search engines understand your site structure and index the right pages.
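For the sitemap option, here is a minimal sketch of what that entry could look like, listing only the canonical URL from the practical example below:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://oshara.ca/services/referencement-naturel-seo/</loc>
  </url>
</urlset>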
Practical Example:
- Page 1: https://oshara.ca/services/referencement-naturel-seo/
- Page 2: https://oshara.ca/services/referencement-seo/
Page 1 is marked as canonical, so it gets indexed:
<link rel="canonical" href="https://oshara.ca/services/referencement-naturel-seo/" />
By using canonical tags, you help Google index the correct pages and improve your content’s relevance in search engines.
Implement Structured Data Schema
Structured data, also called schema markup, is a technical SEO technique that makes your content easier for search engines to understand, which improves indexing and can enhance how your pages appear in search results. When bots crawl your page, schema helps them interpret the content more clearly.
For example, if you have a contact page with fields like name, email, and phone number, Google might have trouble understanding that data without schema markup following Schema.org guidelines.
The way you add structured data depends on your CMS, your website’s size, and your audience. Here are three ways to add it:
- Manual Addition of Schema Markup:
  - Identify the content you want to structure.
  - Manually add the correct schema tags based on Schema.org.
  - Make sure the markup is properly inserted into your site’s HTML.
- Using Google’s Structured Data Markup Helper:
  - Google offers the Structured Data Markup Helper tool to make this easier.
  - Visit the tool and follow the steps to mark up your content.
  - Check the markup with Google’s Rich Results Test (the successor to the old Structured Data Testing Tool).
- Using a WordPress Plugin:
  - If you use WordPress, there are plugins available for schema markup.
  - Find one that suits your needs and install it.
  - Configure it according to your content type.
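As an illustration of the manual approach, here is a minimal JSON-LD sketch for an organization’s contact details, placed in the page’s <head> (the phone number and email address are placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Oshara",
  "url": "https://oshara.ca/",
  "contactPoint": {
    "@type": "ContactPoint",
    "contactType": "customer service",
    "telephone": "+1-555-555-0123",
    "email": "contact@example.com"
  }
}
</script>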
Adding structured data helps search engines better understand your site’s content, which can boost your site’s visibility in search results.
How to Add Noindex and Nofollow Tags Correctly?
When you add a nofollow tag to a webpage, it tells search engines not to follow the links on that page, so the page’s authority (link equity) won’t pass to any of them. In other words, any links on a page marked “nofollow” will be ignored by Google and other search engines when it comes to passing ranking signals.
The noindex rule is a value of the robots meta tag that tells search engines not to include the page itself in their index. You can add it to a website in different ways, depending on the site’s technology or the owner’s preference. Here are some popular methods to implement these tags:
Add the Noindex Tag to an HTML File
The simplest way to add a noindex rule is to place the robots meta tag in the <head> section of the page you don’t want indexed. Just insert the following code into the HTML file of the page you want to hide from search engines:
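<meta name="robots" content="noindex">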
This noindex robots meta tag tells Google not to include the page in its index. If you want to make it explicit that Google should still follow the links on that page, use the noindex, follow value:
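<meta name="robots" content="noindex, follow">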
Adding the Robots Noindex Rule Using the .htaccess File
Another option is to send the noindex rule as an HTTP header from your server, for example via the .htaccess file on Apache. This method is useful when you need to apply the rule to multiple pages, whole directories, or non-HTML files such as PDFs. The directive to use here is the X-Robots-Tag header; a “Disallow” rule belongs in robots.txt and only blocks crawling, so it does not reliably keep pages out of the index.
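Here’s a minimal sketch, assuming an Apache server with mod_headers enabled (the file pattern is purely illustrative):

# Ask search engines not to index any PDF files on the site
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>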
Using the Noindex Tag in CMS Systems
If you use content management systems like WordPress or Joomla, you can manage the noindex tag with special tools. These CMS platforms have plugins that make it easy to add the tag without touching the source code.