How to Use SEO Tools to Diagnose Your March 2026 Google Update Impact

Knowing an update has occurred is only the beginning. What follows is the harder work: using the right tools, in the right order, to pinpoint exactly what changed on your site and why. Here is our expert tool-by-tool checklist for the March 2026 Core and Spam Updates.

In this post, we'll explore:

  • Before you start — set the right baseline
  • Step 1 — Using Google Search Console
  • Step 2 — Using Google Analytics 4 
  • Step 3 — Using Semrush or Ahrefs 
  • Step 4 — Using Screaming Frog 
  • Step 5 — Using Google PageSpeed Insights & Core Web Vitals
  • Your master diagnosis checklist


When Google releases a core or spam update, the instinct for many website owners is to immediately start making changes. This is almost always the wrong move. Before you rewrite a single page or disavow a single link, you need a clear, evidence-based picture of what actually happened, which pages were affected, by which update, and why.

At The Digital Stride, we follow a structured process for every post-update audit, which we will explain in detail below. 

Before you start — set the right baseline

Effective diagnosis requires a clean comparison period before either update touched your site, and it is important to treat the spam and core updates as two separate events.

The March 2026 spam update ran from 24 March to 25 March and is now fully resolved, which means you are working with a complete data set for that update. Use 21–23 March as your baseline and compare traffic and rankings from 24–25 March against that window.

The March 2026 core update began on 27 March and is still rolling out, with Google indicating it may take up to two weeks to complete. Use 24–26 March as your main baseline, as this sits after the spam update finished but before the core update began, giving you a clean snapshot of performance before things started to shift.

It’s also worth looking at a slightly longer period before 27 March (around 7–14 days) to get a better sense of typical performance and smooth out any day-to-day fluctuations. You can then compare everything from 27 March onwards against these benchmarks.

Avoid drawing firm conclusions for now, as rankings are likely to keep fluctuating until the update has fully rolled out.

Log every observation with dates as you go, and resist making any changes until you have worked through all five steps below.
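The baseline comparison above is simple arithmetic: average the metric over the baseline window, average it over the observation window, and express the difference as a percentage. A minimal Python sketch, using illustrative placeholder figures rather than real data:

```python
# Sketch: compare daily organic clicks against a baseline window.
# All click figures below are illustrative placeholders, not real data.

def pct_change(baseline: list[int], current: list[int]) -> float:
    """Percent change of the current daily average vs the baseline daily average."""
    base_avg = sum(baseline) / len(baseline)
    curr_avg = sum(current) / len(current)
    return (curr_avg - base_avg) / base_avg * 100

# Baseline: 24-26 March (after the spam update finished, before the core update began)
baseline_clicks = [1200, 1180, 1210]
# Observation window: 27 March onwards
post_core_clicks = [950, 920, 900, 880]

change = pct_change(baseline_clicks, post_core_clicks)
print(f"Organic clicks vs baseline: {change:+.1f}%")  # prints: Organic clicks vs baseline: -23.7%
```

Using daily averages rather than window totals keeps the comparison fair when the two windows cover a different number of days.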

Step 1 — Google Search Console

Google Search Console is always your first port of call. It provides first-party data directly from Google (the most authoritative source available), and it is free.

First, check the Manual Actions report (Security & Manual Actions → Manual Actions). If a manual action is present, it will name the violation directly. Fixing it is your absolute priority before anything else. Spam-related manual actions typically cite scaled content abuse, link spam, or site reputation abuse.

Next, run two date comparisons in the Performance report:

  • Comparison A: 24–25 March vs 21–23 March. A drop here points to the spam update.
  • Comparison B: 27 March onwards vs 24–26 March. A drop here points to the core update.

Sort by Clicks under the Pages tab and identify your top 20 highest-traffic pages. Check Impressions alongside Clicks: if both dropped, you lost ranking positions; if Impressions held steady but Clicks fell, your pages still rank but searchers are skipping them (a CTR drop). Switch to the Queries tab to identify which keyword clusters took the biggest hits.
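That clicks-versus-impressions logic can be expressed as a small triage function. This is an illustration only, not a GSC feature; the -10% significance threshold is an arbitrary example value, not anything Google defines:

```python
def diagnose_drop(clicks_delta_pct: float, impressions_delta_pct: float,
                  threshold: float = -10.0) -> str:
    """Rough triage of a page-level drop from two GSC percent changes.

    Both deltas are percent changes vs the baseline window. The -10%
    threshold is an illustrative cut-off, not a Google-defined value.
    """
    clicks_down = clicks_delta_pct <= threshold
    impressions_down = impressions_delta_pct <= threshold
    if clicks_down and impressions_down:
        return "lost rankings (clicks and impressions both fell)"
    if clicks_down and not impressions_down:
        return "lost CTR (still ranking, but being skipped in the SERP)"
    return "no significant drop"
```

Running each of your top 20 pages through a check like this turns the Performance report export into a sorted worklist.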

Finally, check the Indexing report for any sudden increase in pages marked "Crawled — currently not indexed," and review Core Web Vitals under Experience for pages flagged as Poor.

Step 2 — Google Analytics 4

While GSC tells you what Google thinks of your pages, GA4 tells you how real users behave once they arrive, which is critical context for diagnosing the core update.

  • Traffic by landing page: Reports → Engagement → Landing page. Filter to Organic traffic only and identify which pages saw the biggest session drops.
  • Engagement rate: Pages with an engagement rate below 40% are underperforming and are prime candidates for a core update demotion.
  • Average engagement time: Pages where users leave within 30 seconds signal low satisfaction to Google.
  • Cross-reference with GSC: Pages with high impressions but poor engagement metrics are a core update red flag.

If the pages with the worst GA4 engagement are exactly the pages that lost rankings in GSC, you have strong evidence that content quality is the culprit, not spam signals.
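The two engagement thresholds in the checklist can be sketched as a flagging function for a GA4 landing-page export. The thresholds mirror the figures above; the function itself is an illustration, not part of the GA4 product:

```python
def flag_page(engagement_rate: float, avg_engagement_seconds: float) -> list[str]:
    """Flag GA4 engagement problems using the thresholds from the checklist above."""
    flags = []
    if engagement_rate < 0.40:          # below a 40% engagement rate
        flags.append("low engagement rate")
    if avg_engagement_seconds < 30:     # users leave within 30 seconds
        flags.append("low engagement time")
    return flags
```

Pages that collect both flags here and also appear in your GSC losers list are the strongest content-quality suspects.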

Step 3 — Semrush or Ahrefs

Once you have your first-party data, third-party platforms provide the competitive intelligence and backlink analysis that GSC and GA4 cannot. Both tools are excellent; the workflows below apply to either.

Ranking and visibility checks: Run a keyword position report comparing 21 March vs 28 March, filtering for keywords that dropped more than five positions. Identify which page types lost the most (blog posts, product pages, or landing pages), as a pattern here points to the content type Google is re-evaluating. For your top lost keywords, check what now ranks in positions 1–3 and study those pages carefully.

Backlink audit (spam update focus): Run a backlink audit and filter for toxic domains or low-quality links. Look for unnatural anchor text patterns; a high proportion of exact-match commercial anchors is a link spam signal. Check your backlink history for sudden spikes from low-quality domains around the update date. Compile a disavow file for clearly toxic links, but do not submit it yet.
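Compiling the disavow file itself is straightforward: Google's disavow format is one entry per line, either a full URL or `domain:example.com` to disavow every link from a domain, with `#` comments allowed. A minimal sketch (the domains and URLs are hypothetical examples):

```python
def build_disavow(toxic_domains: list[str], toxic_urls: list[str]) -> str:
    """Build the text of a Google disavow file (compile it now, submit it later).

    The disavow format accepts one entry per line: either a full URL, or
    'domain:example.com' to disavow every link from that domain. Lines
    starting with '#' are comments.
    """
    lines = ["# Disavow file compiled after the March 2026 spam update"]
    lines += [f"domain:{d}" for d in sorted(set(toxic_domains))]
    lines += sorted(set(toxic_urls))
    return "\n".join(lines) + "\n"

# Hypothetical example entries:
disavow_text = build_disavow(
    toxic_domains=["spammy-links.example"],
    toxic_urls=["https://bad.example/guest-post"],
)
```

Deduplicating and sorting the entries keeps the file easy to review before you eventually submit it through Search Console's disavow tool.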

A quick tip: Semrush Sensor tracks SERP volatility by industry in real time. Check your sector's volatility score for the update windows to understand whether your industry was disproportionately affected.

Learn how The Digital Stride uses Semrush’s AI Tools to Improve SEO and AI.

Step 4 — Screaming Frog SEO Spider

Screaming Frog crawls your site as a search engine would, surfacing technical issues that compound the impact of both updates.

Run a full site crawl and prioritise the pages GSC flagged as losing traffic. Then:

  • Filter by word count, as pages under 300 words that receive organic traffic are thin content candidates.
  • Check for duplicate or near-duplicate content, which dilutes topical authority.
  • Review canonical tags for any pages pointing to a different URL, which may exclude them from ranking.
  • Check crawl depth, as pages more than three clicks from the homepage receive limited crawl budget.
  • Identify orphaned pages with no internal links, which are effectively invisible to crawlers.

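The five checks above amount to a per-URL triage rule. As an illustration (not a Screaming Frog feature), here is how they might be applied to one row of a crawl export, using the thresholds from the list:

```python
def triage_page(word_count: int, crawl_depth: int, inlinks: int,
                canonical_differs: bool) -> list[str]:
    """Apply the crawl checks above to one URL from a crawl export."""
    issues = []
    if word_count < 300:        # thin content candidate
        issues.append("thin content")
    if crawl_depth > 3:         # more than three clicks from the homepage
        issues.append("deep in site architecture")
    if inlinks == 0:            # no internal links pointing at the page
        issues.append("orphaned page")
    if canonical_differs:       # canonical points to a different URL
        issues.append("canonicalised to a different URL")
    return issues
```

Sorting pages by the number of issues returned, then by lost traffic, gives you the fix-first list the next paragraph's GSC overlay is designed to produce.
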
Connect Screaming Frog directly to your GSC account via Configuration → API Access to overlay traffic data onto your crawl results. This immediately shows you which technically problematic pages are also your biggest traffic losers.

Step 5 — Google PageSpeed Insights & Core Web Vitals

Core Web Vitals are confirmed ranking signals. Sites with unresolved CWV failures, particularly on mobile, were disproportionately affected in March 2026. Test your top ten organic landing pages individually, always on mobile first.

  • LCP (Largest Contentful Paint): target under 2.5 seconds.
  • INP (Interaction to Next Paint): target under 200ms. Heavy JavaScript is the most common cause.
  • CLS (Cumulative Layout Shift): target below 0.1. Common causes include images without defined dimensions and late-loading ads.

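The three "good" thresholds above can be encoded directly, which is handy when checking ten pages in a row. A minimal sketch; the thresholds are Google's published CWV boundaries, the function itself is illustrative:

```python
def cwv_status(lcp_seconds: float, inp_ms: float, cls: float) -> dict[str, bool]:
    """Check the three Core Web Vitals against their 'good' thresholds."""
    return {
        "LCP good (< 2.5 s)": lcp_seconds < 2.5,
        "INP good (< 200 ms)": inp_ms < 200,
        "CLS good (< 0.1)": cls < 0.1,
    }
```

Any page returning a False here on mobile belongs at the top of your technical fix list.
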
Cross-reference results with the Core Web Vitals report in GSC; any URLs listed as Poor are your highest-priority technical fixes.


Once the diagnosis is complete

You should now have a prioritised list of affected pages, a clear verdict on which update caused each drop, and specific issues to address. Only at this point should you begin making changes, starting with your highest-traffic pages first.

Get in touch with The Digital Stride team if you need expert support turning this diagnosis into a recovery plan.

👉 Visit our SEO Services page

👉 Call us or Enquire Now to get started

Published: 6 Apr 2026

Author: David Jolly