Understanding the SEO Audit for Websites – Part 1

If you manage a business with an online presence, you will have contemplated the benefits of conducting an SEO audit, as recommended by your SEO marketing team. Understanding the main elements of an SEO audit will help you become familiar with the audit process and allow you to discuss your business requirements with your SEO technician.

Alternatively, knowing which factors an SEO audit considers can help you design your website and marketing strategy around them, so that your online presence achieves better positions in the search engine results.

Preparing for your SEO Audit

Crawling your Website

Before you start to analyse your website content for search-friendly elements, you will need to run a site crawl to uncover any problems that exist within the site. If you have coding skills, you can write your own crawler or, if not, you can choose from a range of ready-made crawling spider tools.

This will show you the information that a search engine spider sees when crawling your site. Your crawl should be configured to simulate the most popular search bots, including Googlebot and Bingbot, as in the sketch that follows the list below.

Popular Search Engine User Agents:

Googlebot

Bingbot
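
As a rough illustration, the short Python sketch below fetches a single page while identifying itself as each of these crawlers, so you can spot differences in the status code or the content served to search engine spiders. The requests library and the example.com address are assumptions for the sketch and should be replaced with your own tooling and homepage.

import requests

# Published user agent strings for the two most popular search bots.
USER_AGENTS = {
    "Googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "Bingbot": "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)",
}

url = "https://www.example.com/"  # placeholder: use your own homepage
for name, agent in USER_AGENTS.items():
    # Request the page as each crawler and report status and payload size.
    response = requests.get(url, headers={"User-Agent": agent}, timeout=10)
    print(name, response.status_code, len(response.content), "bytes")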

Consulting Webmaster Tools

Bing and Google offer diagnostic tools that provide a detailed picture of your website's performance. Once your website is registered with the webmaster tools for the major search engines, you will be able to access a large amount of data and analytics about how your website is currently ranking.

Using your favourite analytics monitor, you can investigate the patterns of your website's visitors and analyse the traffic data.

Search Engine Accessibility

Ensuring that a search engine spider can easily navigate your website and has unrestricted access to all of the content that you want to rank for is the first stage in optimising your website for SEO.

Robots.txt and Meta Tags

Robots.txt files are added to websites for a number of reasons, usually to prevent search engine spiders from accessing all or specific pages of the site. Blocking rules in a robots.txt file will prevent that content from being indexed by the search engines and must be removed if you want the content to be added to search engine results.
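
As a minimal sketch of how you might check this, Python's standard urllib.robotparser module can read a robots.txt file and report whether a particular URL is blocked for a given crawler. The example.com addresses below are placeholders.

from urllib.robotparser import RobotFileParser

# Download and parse the site's robots.txt file (placeholder address).
parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

# Check whether an important page is reachable by each major crawler.
for agent in ("Googlebot", "Bingbot"):
    allowed = parser.can_fetch(agent, "https://www.example.com/products/")
    print(agent, "allowed" if allowed else "blocked")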

Robots meta tags provide information to search engine crawlers about pages that should not be indexed or links that should not be followed. These should be removed from any content that you want to be indexed for search engine results.
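
A quick way to spot these tags during an audit is to scan each page's HTML for a robots meta tag. The sketch below uses Python's standard html.parser together with the requests library and a placeholder URL, both of which are assumptions for the example.

from html.parser import HTMLParser
import requests

class RobotsMetaFinder(HTMLParser):
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Report any robots meta tag, e.g. content="noindex, nofollow".
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            print("robots meta tag found:", attrs.get("content"))

html = requests.get("https://www.example.com/", timeout=10).text  # placeholder URL
RobotsMetaFinder().feed(html)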

URL Errors Returned

Broken or missing URLs that should be available on your website will return errors such as a 404. Any links to broken URLs need to be redirected to relevant pages, and 301 HTTP redirects offer the best results in directing search engine attention to the required destination.
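
As a simple first pass, the sketch below requests a handful of URLs and reports the HTTP status code and any redirect destination, so you can see which links return a 404 and which are already redirected with a 301. The URL list and the requests library are assumptions for the example.

import requests

# Placeholder URLs: in practice, take this list from your site crawl.
urls = [
    "https://www.example.com/old-page/",
    "https://www.example.com/missing-page/",
]

for url in urls:
    # A HEAD request is enough to read the status code and redirect target.
    response = requests.head(url, allow_redirects=False, timeout=10)
    print(url, response.status_code, response.headers.get("Location", ""))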

Checking Sitemaps

Your website will require an up-to-date XML sitemap that is submitted to the search engines to help them navigate your website correctly. A sitemap provides directions that enable a search engine to find all the pages on your site. Ensure that every page found by your site crawl appears in the sitemap, and that every page in your sitemap is still located in the correct place on the website.
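
One way to make this comparison is sketched below: the script reads the XML sitemap, collects the listed URLs, and compares them against the URLs found by your crawl. The sitemap address and the crawled list are placeholders, and the requests library is assumed.

import xml.etree.ElementTree as ET
import requests

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# Placeholder sitemap address: use your own sitemap location.
sitemap_xml = requests.get("https://www.example.com/sitemap.xml", timeout=10).text
sitemap_urls = {loc.text.strip() for loc in ET.fromstring(sitemap_xml).iter(SITEMAP_NS + "loc")}

# Placeholder: in practice this set comes from your site crawl.
crawled_urls = {"https://www.example.com/", "https://www.example.com/about/"}

print("Found by the crawl but missing from the sitemap:", crawled_urls - sitemap_urls)
print("Listed in the sitemap but not found by the crawl:", sitemap_urls - crawled_urls)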

Helpful Videos:

How to Submit a Sitemap to Google

How to Submit a Sitemap to Bing

Website Structure

The structure of your website defines which pages are placed in order of importance and relevance to the search results that you are trying to achieve. The architecture of your website is composed of content arranged in horizontal and vertical depth. The further into the architecture a page of content sits, the less important it is determined to be.

The number of clicks it takes to reach a page from the homepage indicates how relevant that content is and how highly it is prioritised within the website hierarchy.
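
Click depth can be measured directly from your crawl data. The sketch below walks a small hand-built link graph breadth first from the homepage and reports how many clicks away each page sits; on a real audit the graph would come from your crawl rather than being typed in by hand.

from collections import deque

# Placeholder link graph: each page maps to the pages it links to.
links = {
    "/": ["/services/", "/about/"],
    "/services/": ["/services/seo-audit/"],
    "/about/": [],
    "/services/seo-audit/": [],
}

depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for target in links.get(page, []):
        if target not in depth:  # first time this page has been reached
            depth[target] = depth[page] + 1
            queue.append(target)

for page, clicks in sorted(depth.items(), key=lambda item: item[1]):
    print(page, "is", clicks, "click(s) from the homepage")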

JavaScript and Flash Navigation

If your website uses JavaScript or Flash to provide navigation elements, you may be blocking the search engine from reaching content that a user can access. Search engine crawlers are largely unable to read what is inside a Flash or JavaScript element (although they are becoming increasingly intelligent), and as a result anything on the other side of a Flash or JavaScript barrier is inaccessible to them.

Compare a site crawl of your website with JavaScript enabled against one without. If any area is not accessible when JavaScript has been disabled, you should provide a more accessible alternative form of navigation.
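
A rough way to approximate the JavaScript-disabled view is to list the links present in the raw HTML, which is close to what a crawler sees before any scripts run, and compare them against the navigation you can see in your browser. The sketch below does this with Python's standard html.parser; the requests library and the URL are assumptions.

from html.parser import HTMLParser
import requests

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record every plain <a href="..."> link found in the raw HTML.
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

collector = LinkCollector()
collector.feed(requests.get("https://www.example.com/", timeout=10).text)  # placeholder URL
print(len(collector.links), "links reachable without JavaScript")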

Performance Optimisation

There are many elements within your website which can be optimised to produce a faster loading time. By ensuring that your website loads quickly, you will get the site indexed within the limited time that a search engine allocates to each website, and you will also hold the attention of impatient users.
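
A quick first measurement is sketched below: it records the response time and page weight for a few URLs so you can see which pages are slow or heavy before digging into individual elements. The URLs are placeholders and the requests library is assumed; dedicated performance tools will give far more detail.

import requests

# Placeholder URLs: test the pages that matter most to your rankings.
for url in ["https://www.example.com/", "https://www.example.com/services/"]:
    response = requests.get(url, timeout=10)
    seconds = response.elapsed.total_seconds()  # time taken to receive the response
    kilobytes = len(response.content) / 1024    # size of the downloaded HTML
    print(f"{url}: {seconds:.2f} seconds, {kilobytes:.0f} KB")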
