
Surviving Algorithm Changes With a Technical SEO Audit


Search engine algorithms change more often than you think. You may not notice them, as most changes are minor and rarely require the affected websites to take decisive action. The few that search engines like Google announce, however, impact websites’ visibility in search results, either for better or worse.

It’s hard to nail down an exact number, but Search Engine Land estimates that Google’s algorithm undergoes roughly 13 changes per day. That figure is based on 4,725 updates shipped in 2022, 10 of which were major. So far this year, Google has rolled out only three major updates (in March, June, and August).

Maintaining search visibility involves keeping up with these changes, whether or not they’re publicly known. If your content isn’t getting as much traffic as you believe it should, it might be time for a thorough website audit, starting with its technical aspects.

Technical SEO Explained

Every part of a web page contributes to its ranking in search engine results, including the back end, the parts a visitor never sees. While these elements don’t pull as much weight as on- and off-page content, optimising them for search still helps.

This is the realm of technical search engine optimisation, or technical SEO. It’s concerned with improving a website’s infrastructure so that search engines can crawl and index its content efficiently. Think of it like this: when a recipe calls for filleting or butterflying a fish, technical SEO points out the right knife for the job.

Although it’s one of the three pillars of SEO, technical SEO is often treated as an afterthought. Yet even the best-written content on the Web won’t matter if the website itself is plagued with performance issues. These include, but aren’t limited to:

  • Poorly written code and scripts
  • Slow server-side response times
  • Improperly scaled website assets
  • Too many client-side requests

That afterthought treatment also shows in how often brands assign technical SEO tasks to staff without the relevant training. It’s called “technical” for a reason: it involves a good deal of coding and web development.

If the right talent is in short supply, working with a dedicated SEO service provider such as Smartly Done is a viable alternative. Their experience in web development, quality hosting, and custom programming is reflected in their work, not just in developing a search-friendly website, but also in optimising existing ones.

The Complete Checklist

A thorough audit begins with knowing the key elements, and there are eight of them for technical SEO. Here’s a closer look at each.

Crawlability

Search engines rank content by sending out crawlers, better known as bots or spiders, to the pages they find. The crawlers visit not only the page assigned to them but also the links on that page, then the links on those pages, and so on. Even then, it’s estimated that crawlers cover only up to 70% of the Internet.

For crawlers to cover a site efficiently, it needs three elements.

  • An XML sitemap that provides a list of crawlable web pages (a minimal example follows this list)
  • Internal links that direct crawlers to other crawlable web pages
  • A site structure where pages are as few clicks from the homepage as possible
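
A sitemap doesn’t need to be hand-written; it’s simply an XML file listing the URLs you want crawled. Below is a minimal sketch that generates one with Python’s standard library. The URLs are placeholders, and real sitemaps often add extra fields such as lastmod.

  # Generate a minimal XML sitemap from a list of URLs (standard library only).
  import xml.etree.ElementTree as ET

  urls = [  # placeholder URLs
      "https://www.example.com/",
      "https://www.example.com/products/",
      "https://www.example.com/blog/",
  ]

  urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
  for url in urls:
      ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url

  ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)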

Even with all three in place, problems may still arise. Internal links may break because the target page was deleted or moved, because of a misspelt URL, or even because of server-side issues. Some pages may receive no internal links at all, though you can mitigate that by including them in the navigation menu or the XML sitemap.
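
To spot broken internal links before a crawler does, a short script goes a long way. The following is a simplified, single-page sketch that assumes the third-party requests and beautifulsoup4 packages are installed; the start URL is a hypothetical placeholder.

  # Minimal internal-link checker: fetches one page and reports links that
  # return an error status.
  import requests
  from urllib.parse import urljoin, urlparse
  from bs4 import BeautifulSoup

  START_URL = "https://www.example.com/"  # hypothetical placeholder

  def check_internal_links(page_url):
      domain = urlparse(page_url).netloc
      html = requests.get(page_url, timeout=10).text
      soup = BeautifulSoup(html, "html.parser")
      for anchor in soup.find_all("a", href=True):
          link = urljoin(page_url, anchor["href"])
          if urlparse(link).netloc != domain:
              continue  # skip external links
          status = requests.head(link, allow_redirects=True, timeout=10).status_code
          if status >= 400:
              print(f"Broken internal link: {link} (HTTP {status})")

  check_internal_links(START_URL)

A full audit would crawl every page rather than just one, but the same check applies.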

Another common issue is errors in the robots.txt file, a text document that tells crawlers which pages to crawl and which to skip. Besides typos, common errors include pages mistakenly added to the allow or disallow rules.
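
One way to catch a page that has been disallowed by mistake is to test your most important URLs against the live robots.txt file. Here’s a minimal sketch using Python’s built-in urllib.robotparser; the domain and URL list are placeholders.

  # Check whether key pages are blocked by robots.txt (standard library only).
  from urllib.robotparser import RobotFileParser

  parser = RobotFileParser("https://www.example.com/robots.txt")  # placeholder domain
  parser.read()

  important_pages = [
      "https://www.example.com/",
      "https://www.example.com/products/",
      "https://www.example.com/blog/latest-post",
  ]

  for url in important_pages:
      if not parser.can_fetch("Googlebot", url):
          print(f"Blocked by robots.txt: {url}")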

Indexability

Once crawling is done, the crawlers send the data back to the search engine for “indexing”, that is, adding the pages to the index from which search results are drawn. Note that being crawled doesn’t guarantee that a page will be indexed, let alone rank.

Naturally, pages that go against the search engine’s content quality guidelines won’t be indexed. However, pages can also go unindexed for other reasons, such as the type of content. For instance, legal and privacy statements exist to fulfil legal requirements and typically have little impact on a website’s SEO strategy.

Some issues are caused by conflicting directives, such as a page containing both follow and nofollow robots meta tags, or a noindex tag left on a page you want indexed. In both cases, search engines honour the more restrictive directive, meaning nofollow and noindex win every time. Marketers are also known to forget to update these tags when migrating pages.
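
A quick audit pass can flag pages that still carry restrictive directives, for example after a migration. The sketch below checks a handful of URLs for both the robots meta tag and the X-Robots-Tag response header; the URL list is a placeholder, and the requests and beautifulsoup4 packages are assumed to be installed.

  # Flag pages carrying restrictive robots directives.
  import requests
  from bs4 import BeautifulSoup

  urls_to_check = [  # placeholder URLs
      "https://www.example.com/page-one",
      "https://www.example.com/page-two",
  ]

  for url in urls_to_check:
      response = requests.get(url, timeout=10)
      # Directives can arrive as an HTTP header...
      header = response.headers.get("X-Robots-Tag", "").lower()
      # ...or as a <meta name="robots"> tag in the HTML.
      soup = BeautifulSoup(response.text, "html.parser")
      meta = soup.find("meta", attrs={"name": "robots"})
      content = (meta.get("content", "") if meta else "").lower()
      for directive in ("noindex", "nofollow"):
          if directive in header or directive in content:
              print(f"{url}: found '{directive}' directive")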

Site Structure

While crawlers aren’t exactly customers, they benefit from a streamlined site structure just as much. Just as human visitors can easily find what they’re looking for, crawlers can easily reach every page on the site.

A common piece of advice is the so-called “three-click rule,” which states that any page should be no more than three clicks away from a home or landing page. While it sounds sensible, no study has yet supported this claim. Additionally, experts emphasise that the number of clicks isn’t a good way to measure a website’s ease of use.

Building a search-friendly structure requires four elements.

  • Shallow/flat architecture: Whatever the type of structure, it pays to keep your pages as close to the homepage as possible. Three clicks isn’t a bad rule of thumb, provided everything can realistically be reached within that number (see the click-depth sketch after this list).
  • Consistent URL structure: Form URLs in a specific structure, primarily following a path (e.g., category/topic/page). Experts don’t recommend structures that use URL parameters, as they aren’t user-friendly.
  • Contextual internal linking: Plan internal links in a way that helps visitors better understand the content’s context. Choose pages that support the information on the primary page and vice versa.
  • Scalable navigation: Design the website’s navigation to incorporate future content (e.g., new product types, new blog post categories). Maintain consistency between the header and footer navigation menus.
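
To see how deep your pages actually sit, you can measure click depth with a small breadth-first crawl from the homepage. The sketch below is a simplified, single-domain example that assumes the requests and beautifulsoup4 packages are installed; the start URL and page cap are placeholder values.

  # Breadth-first crawl that records how many clicks each page is from the homepage.
  from collections import deque
  from urllib.parse import urljoin, urlparse
  import requests
  from bs4 import BeautifulSoup

  START_URL = "https://www.example.com/"  # placeholder
  MAX_PAGES = 200                         # keep the sketch small

  def click_depths(start_url):
      domain = urlparse(start_url).netloc
      depths = {start_url: 0}
      queue = deque([start_url])
      while queue and len(depths) < MAX_PAGES:
          url = queue.popleft()
          try:
              soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
          except requests.RequestException:
              continue
          for anchor in soup.find_all("a", href=True):
              link = urljoin(url, anchor["href"]).split("#")[0]
              if urlparse(link).netloc == domain and link not in depths:
                  depths[link] = depths[url] + 1
                  queue.append(link)
      return depths

  for page, depth in click_depths(START_URL).items():
      if depth > 3:
          print(f"{depth} clicks deep: {page}")

Pages flagged here aren’t automatically a problem, but they’re worth a second look against the shallow-architecture goal above.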

Core Web Vitals

Google’s Core Web Vitals are three metrics that measure a website’s user experience in terms of load speed, interactivity, and visual stability. They are:

  • Largest Contentful Paint (load speed): the time taken for a page’s main content to load. Good: under 2.5 seconds; Needs Improvement: 2.5 to 4.0 seconds; Poor: over 4.0 seconds.
  • Interaction to Next Paint (interactivity): the time taken for the page to respond to a user interaction such as a click. Good: under 200 ms; Needs Improvement: 200 to 500 ms; Poor: over 500 ms.
  • Cumulative Layout Shift (visual stability): a score for how much the layout shifts while the page loads. Good: under 0.1; Needs Improvement: 0.1 to 0.25; Poor: over 0.25.

As metrics developed by Google, Core Web Vitals are unsurprisingly a search ranking factor. They don’t count for as much as quality content and links (Google’s John Mueller has said that a more relevant site will still outrank a faster-loading one), but they are a ranking factor nonetheless.

You can get a report of all three metrics and more from Google Search Console; PageSpeed Insights and Lighthouse offer similar reports. Even lifting your website’s vitals from Poor into the Needs Improvement range can be enough to move rankings.
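
To make the thresholds in the table concrete, here’s a small sketch that buckets a set of field measurements into Good, Needs Improvement, or Poor. The sample values are made up; in practice you would plug in numbers from Search Console, PageSpeed Insights, or your own monitoring.

  # Classify Core Web Vitals measurements against Google's published thresholds.
  THRESHOLDS = {
      "LCP": (2.5, 4.0),   # seconds: Good <= 2.5, Poor > 4.0
      "INP": (200, 500),   # milliseconds: Good <= 200, Poor > 500
      "CLS": (0.1, 0.25),  # unitless score: Good <= 0.1, Poor > 0.25
  }

  def rate(metric, value):
      good, poor = THRESHOLDS[metric]
      if value <= good:
          return "Good"
      return "Needs Improvement" if value <= poor else "Poor"

  # Hypothetical measurements for one page
  sample = {"LCP": 3.1, "INP": 180, "CLS": 0.32}
  for metric, value in sample.items():
      print(f"{metric}: {value} -> {rate(metric, value)}")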

Mobile-First

A study published in June by the Institute of Practitioners in Advertising revealed that mobile use has finally overtaken TV, with the average British adult spending 3.35 hours a day on mobile devices (compared to 3.27 hours watching TV). Half of that time is spent browsing social media, with audio and video coming second and third, respectively.

Not having a mobile website, or at least a mobile-friendly one, is essentially inviting the Internet to overlook you. Google foresaw that mobile would come to dominate desktop, prompting it to introduce mobile-first indexing. Simply put, Google uses your website’s mobile version to determine its ranking.

Responsive web design is often the first approach that comes to mind, but it’s by no means the only one that matters. Other, equally important practices include the following (a quick automated check appears after the list):

  • Adequate spacing for touchable assets
  • Readable fonts and font sizes (no zoom required)
  • A layout that minimises horizontal scrolling
  • A design free of intrusive interstitials (pop-up ads)
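
One of the simplest mobile-readiness checks you can automate is confirming that every template includes a viewport meta tag, without which mobile browsers fall back to a zoomed-out desktop layout. Here’s a minimal sketch; the URLs are placeholders, and the requests and beautifulsoup4 packages are assumed to be installed.

  # Check pages for a <meta name="viewport"> tag, a baseline for mobile-friendly rendering.
  import requests
  from bs4 import BeautifulSoup

  pages = [  # placeholder URLs
      "https://www.example.com/",
      "https://www.example.com/products/",
  ]

  for url in pages:
      soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
      viewport = soup.find("meta", attrs={"name": "viewport"})
      if viewport is None:
          print(f"Missing viewport meta tag: {url}")
      else:
          print(f"{url}: viewport = {viewport.get('content', '')}")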

Site Security

Nothing dissuades visitors from a website more than a recent data breach. But almost as discouraging is a URL that doesn’t start with “https”.

The HTTPS protocol (HTTP Secure) signifies that a website encrypts the data it exchanges with the user’s web browser. Doing so prevents hackers from stealing that data in transit or, if they do intercept it, from being able to make sense of it. Most browsers, Google Chrome in particular, warn users if a website doesn’t have HTTPS enabled.
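
Part of an HTTPS audit is confirming that plain http:// requests actually end up on the secure version and that the certificate validates. A minimal sketch using the requests package, with a placeholder domain:

  # Verify that the insecure URL redirects to HTTPS; an invalid certificate raises an exception.
  import requests

  insecure_url = "http://www.example.com/"  # placeholder

  response = requests.get(insecure_url, timeout=10)  # follows redirects, verifies certificates by default
  if response.url.startswith("https://"):
      print(f"OK: {insecure_url} redirects to {response.url}")
  else:
      print(f"Warning: {insecure_url} is still being served over HTTP")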

HTTPS is a confirmed ranking factor. Even if it weren’t, securing your website this way is still a good idea for giving visitors peace of mind. Of course, it’s no substitute for broader cybersecurity.

Structured Data

Structured data (sometimes called schema markup) enables search engines to interpret certain types of content as intended. For example, a recipe for shepherd’s pie may show up in searches for “shepherd’s pie” in a recipe format. Such results are referred to as “rich results,” and they enhance the user’s search experience.
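
As an illustration, here’s a minimal sketch that builds a JSON-LD block for a recipe page using schema.org’s Recipe type. The recipe details are made up, and in practice you would render the resulting script tag into the page’s HTML.

  # Build a minimal schema.org Recipe block as JSON-LD.
  import json

  recipe = {
      "@context": "https://schema.org",
      "@type": "Recipe",
      "name": "Shepherd's Pie",  # example values only
      "author": {"@type": "Person", "name": "Jane Doe"},
      "prepTime": "PT30M",
      "cookTime": "PT45M",
      "recipeIngredient": ["500 g minced lamb", "4 potatoes", "1 onion"],
      "recipeInstructions": [
          {"@type": "HowToStep", "text": "Brown the lamb and onion."},
          {"@type": "HowToStep", "text": "Top with mashed potato and bake."},
      ],
  }

  print(f'<script type="application/ld+json">{json.dumps(recipe)}</script>')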

Google recognises 27 types of structured data as of this writing. Your content doesn’t need to be compatible with all of them, as some types work better than others. Recipe content benefits most from a carousel, which organises similar content into a series of cards that users can scroll through (especially on mobile).

Some content may use more than one structured data type. Carousels, for instance, must be combined with one of four types: course list, movie, recipe, or restaurant.

To test whether a page supports rich results, paste the page URL or a portion of its code into Google’s Rich Results Test. The tool then collects and analyses valid items and flags any issues it finds. Note that this only works on publicly accessible pages; pages behind a login or firewall may need more advanced debugging.

Canonicalisation

Contrary to popular belief, Google doesn’t penalise duplicate content unless it was meant to game the algorithm. Still, having multiple copies of one page risks confusing search engines as to which version to rank. If that happens, all the copies may be pushed several ranks down or, worse, dropped from search results.

If you want one version to be treated as the original, mark it with canonical tags: each duplicate carries a rel="canonical" link pointing to the preferred URL. That way, the search engine can focus its crawling and indexing efforts on the canonical (preferred) page and consolidate ranking signals from the duplicates. This is crucial for e-commerce sites, where product pages can spawn duplicates through item filters (e.g., colour, size).
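
A simple way to audit this is to confirm that each variant URL declares the canonical you expect. The sketch below checks the rel="canonical" link element on a couple of hypothetical filtered product URLs; the mapping is a placeholder, and the requests and beautifulsoup4 packages are assumed to be installed.

  # Confirm that duplicate/variant URLs point their canonical tag at the preferred page.
  import requests
  from bs4 import BeautifulSoup

  # Placeholder mapping: variant URL -> expected canonical URL
  expected = {
      "https://www.example.com/shirts?colour=blue": "https://www.example.com/shirts",
      "https://www.example.com/shirts?size=medium": "https://www.example.com/shirts",
  }

  for variant, canonical in expected.items():
      soup = BeautifulSoup(requests.get(variant, timeout=10).text, "html.parser")
      tag = soup.find("link", rel="canonical")
      declared = tag.get("href") if tag else None
      if declared != canonical:
          print(f"{variant}: expected canonical {canonical}, found {declared}")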

Canonical tags aren’t foolproof, especially if the search engine detects inconsistencies in the website’s signals. For example, if a canonical tag points to a URL format that differs from the one the website normally uses, the search engine may ignore the tag in favour of a duplicate with the standard format.

Another method, which sends a stronger signal to search engines, is a 301 redirect. A page with a 301 redirect automatically sends visitors to another URL, signalling that the old page no longer exists, has been permanently moved, or that its content no longer applies.
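
You can verify that a retired URL really returns a 301, and see where it points, without following the redirect. A minimal sketch using the requests package, with a placeholder URL:

  # Inspect the redirect status and target of a retired URL.
  import requests

  old_url = "https://www.example.com/old-page"  # placeholder

  response = requests.get(old_url, allow_redirects=False, timeout=10)
  if response.status_code == 301:
      print(f"301 redirect to: {response.headers.get('Location')}")
  else:
      print(f"Expected a 301, got HTTP {response.status_code}")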

Technical SEO Amid AI Search

With AI summaries and AI-powered search rapidly becoming the norm, there’s talk about whether technical SEO, and SEO in general, will remain relevant. While major changes are inevitable and some elements may be deprioritised, technical SEO will continue to have a significant impact in at least four areas.

  • Structured data: AI tools love structured data because it helps retrieve information more efficiently. The labels allow them to formulate their answers clearly.
  • Website structure: AI tools pull data from billions of pages at a time, which means they have neither the time nor the patience to wait for yours to load fully.
  • Core Web Vitals: It’s unlikely for Google to leave out metrics it developed, even as it incorporates more AI into its search engine.
  • Site security: Protecting a website is a matter of fostering trust, something that’ll only become more important in an AI-dominated era.

A Technical SEO Audit May Be In Order

If you can’t remember the last time your website underwent a technical SEO audit, it may be time for one. Guideline updates may have rendered your last batch of optimisations obsolete or even turned them into violations. An audit is also a good idea after a website overhaul.