What technical SEO factors can you assess with an audit?

July 31, 2020 by Aimee

Getting technical SEO right gives your content a strong foundation to work from, whether you are tailoring pages for desktop or mobile. An SEO audit will help you stay on top of these factors and uncover potential issues. Here are seven things you can assess.

Sitemaps

Sitemaps help to establish the hierarchy of your website and communicate its structure to search engines, so they know where pages are and can access them.

You can check whether the sitemap is set up correctly by navigating to the sitemaps section in Google Search Console. Here you will be able to confirm if and when a sitemap has been submitted, the URLs that are being indexed and any potential issues.
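Alongside Search Console, you can sanity-check a sitemap yourself. The sketch below parses a minimal, made-up sitemap with the Python standard library and lists the URLs it declares:

```python
# Parse a sitemap and list the URLs it declares, using only the
# standard library. The XML below is a minimal illustrative sitemap.
import xml.etree.ElementTree as ET

SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
</urlset>"""

def sitemap_urls(xml_text):
    """Return every <loc> URL declared in a sitemap document."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", ns)]

print(sitemap_urls(SITEMAP_XML))
```

Comparing the list this produces against the URLs Search Console reports as indexed is a quick way to spot pages that were submitted but never picked up.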

Mobile optimisation

Google will switch all websites to mobile-first indexing in 2021, so it is important to audit the smartphone readiness of web pages. You can run several quick tests via a user-agent switcher for Google Chrome, which lets you view mobile pages in a desktop browser.

Content should be formatted correctly, have a responsive design and be easily scrollable for users. Videos also need to be compatible and fast loading. Run a final check on pages on an actual smartphone and note down any issues that can be addressed.
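You can also fetch pages as a mobile crawler would see them by sending a mobile user-agent header. A minimal sketch with the standard library, where the UA string is an illustrative Googlebot Smartphone value (check Google's documentation for the current string):

```python
# Sketch: build a request that identifies itself as a mobile crawler.
# The user-agent string is an illustrative example, not authoritative.
import urllib.request

MOBILE_UA = ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
             "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 "
             "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
             "+http://www.google.com/bot.html)")

def mobile_request(url):
    """Build a GET request carrying a mobile user-agent header."""
    return urllib.request.Request(url, headers={"User-Agent": MOBILE_UA})

req = mobile_request("https://example.com/")
# urllib stores header names capitalised, hence "User-agent" here.
print(req.get_header("User-agent")[:11])
```

Passing the request to `urllib.request.urlopen` (not done here to avoid a live network call) would then return the page as served to that user agent.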

Image optimisation

Large, unoptimised images can drag down page load times, which is far from ideal on mobile. Using the correct format is important: JPEG offers the best balance of quality and file size, while PNG provides the best quality.

The former is usually better for general mobile browsing. You should also compress your images accordingly. A tool called Screaming Frog is also useful for collating images and flagging those that are oversized or have missing or overly long alt text.
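One simple audit check is to identify an image's real format from its leading "magic" bytes rather than trusting the file extension, so mislabelled files are caught. A sketch covering just the two formats discussed above:

```python
# Identify an image format from its magic bytes. Only the JPEG and
# PNG signatures are checked; anything else is reported as unknown.
def image_format(data: bytes) -> str:
    if data.startswith(b"\xff\xd8\xff"):
        return "JPEG"
    if data.startswith(b"\x89PNG\r\n\x1a\n"):
        return "PNG"
    return "unknown"

print(image_format(b"\x89PNG\r\n\x1a\n" + b"\x00" * 8))  # PNG
```

Running this over the first few bytes of each file in an image directory quickly surfaces, say, a PNG renamed with a .jpg extension.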

Robots.txt

Robots.txt files tell search engine crawlers which URLs they may and may not crawl; pages blocked this way will generally not make it into Google's index.

You can use Google Search Console to see whether robots.txt is present on your site and whether it is having an impact on performance in SERPs. For pages you want crawled, the relevant user-agent group should carry an empty “Disallow:” directive, which permits user agents to crawl everything.
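The standard library can verify this directly. The sketch below parses a minimal robots.txt with an empty Disallow directive and confirms a crawler is allowed through:

```python
# Check whether a robots.txt actually allows crawling, using the
# standard-library parser. An empty "Disallow:" permits everything.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())
print(rp.can_fetch("Googlebot", "https://example.com/any-page"))  # True
```

Swapping in your live robots.txt (via `rp.set_url(...)` and `rp.read()`) lets you test specific URLs against the rules you actually serve.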

Crawl errors

Crawl errors can torpedo even the best SEO campaigns, as they make it difficult for Google to index the right web pages.

Fortunately, Google Search Console makes the task of identifying them relatively easy: look out for 4xx client errors (such as 404 not found) and 5xx server errors. Fixing these quickly will allow Google to find your pages.
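If you export crawl results (from Search Console or a crawler), triaging them is a one-liner. The URL/status pairs below are made-up sample data:

```python
# Filter crawl results down to error responses: anything with a
# status code of 400 or above (4xx client errors, 5xx server errors).
crawl_results = [
    ("https://example.com/", 200),
    ("https://example.com/old-page", 404),
    ("https://example.com/api", 500),
]

def crawl_errors(results):
    """Return only the (url, status) pairs that signal an error."""
    return [(url, code) for url, code in results if code >= 400]

print(crawl_errors(crawl_results))
```

Sorting the output by status class makes it easy to hand 5xx issues to whoever runs the server and fix 404s with redirects.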

SSL certification

Google also has a preference for SSL certified sites as they are more likely to provide users with secure and private digital experiences.

If you load up your website and spot a warning or red X in the address bar, then you may have issues with the SSL certificate. A common fix is a wildcard certificate, which covers all first-level subdomains (such as www) so that your https:// URLs resolve properly across the site.
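To illustrate what a wildcard certificate does and does not cover, the simplified matcher below follows the usual rule that `*` stands in for exactly one hostname label, so `*.example.com` covers `www.example.com` but not the bare apex or nested subdomains (this is a sketch of the matching rule, not a TLS check):

```python
# Simplified wildcard hostname matching: "*" matches exactly one
# label, mirroring how wildcard certificates are typically scoped.
def wildcard_matches(pattern: str, hostname: str) -> bool:
    p_labels = pattern.lower().split(".")
    h_labels = hostname.lower().split(".")
    if len(p_labels) != len(h_labels):
        return False
    return all(p == "*" or p == h for p, h in zip(p_labels, h_labels))

print(wildcard_matches("*.example.com", "www.example.com"))   # True
print(wildcard_matches("*.example.com", "example.com"))       # False
```

This is why sites often need the apex domain listed separately on the certificate even when a wildcard covers every subdomain.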

Minifying JavaScript and CSS

The process of removing redundant data without impacting the experience is known as minification. Minifying your JavaScript and CSS will get rid of bloated code and help to reduce your page load times.

Most websites serve at least one of each of these files, and making sure they are cleanly coded will eliminate other potential issues too. You can use online tools to see whether any of these files are leading to server bottlenecks and slowdowns.
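To make the idea concrete, here is a naive CSS minifier that strips comments and collapses whitespace. Production minifiers handle many more cases; this only sketches the core transformation:

```python
# Naive CSS minification: drop /* ... */ comments, collapse runs of
# whitespace, and trim spaces around structural punctuation.
import re

def minify_css(css: str) -> str:
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # drop comments
    css = re.sub(r"\s+", " ", css)                   # collapse whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # trim around punctuation
    return css.strip()

print(minify_css("body {\n  color: red;  /* brand */\n}"))
# → body{color:red;}
```

The same principle (remove characters the browser does not need) is what dedicated JavaScript and CSS minifiers apply at scale.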
