Google’s Latest Update, Crawl Limits, and AI Traffic Shifts: What It Means Right Now

Crawl Limits

It’s been a busy week in the search world, with Google rolling out a major update and sharing new details about how its crawler actually works.

Perhaps most interesting of all, Google Gemini is seeing a sharp rise in the traffic it sends out from its AI tool.

Individually, each of these developments matters. Taken together, they reveal how quickly the mechanics of search and discovery are changing.

The March Core Update Is Underway

Google has kicked off its March 2026 core update, the first broad update of the year. As with most core updates, the aim is simple in theory: to improve the quality of search results by pushing more useful, relevant content higher up.

In practice, though, these updates rarely feel simple. Rankings can fluctuate for days, sometimes weeks, before settling.

Google has said this rollout could take up to two weeks, so any sudden spikes or drops in traffic right now shouldn’t be taken as final.

What makes this update particularly notable is the timing. The last major core update wrapped up at the end of December, leaving a noticeable gap. For many site owners, this is the first real shake-up in search rankings in months.

Google’s advice remains consistent: don’t rush to conclusions. Wait until the rollout is fully complete, then compare performance with data from before the update began.

Why Rankings Don’t Change All at Once

One of the more helpful clarifications this week came from John Mueller, who explained that core updates don’t roll out in one clean sweep.

Instead, they happen in layers.

Different systems are updated at different times, which is why rankings often move in waves. A site might drop early in the rollout and recover later, or the other way around. It’s not always a straight line.

Mueller also drew a clear distinction between the recent spam update and the current core update. One targets manipulative content; the other looks more broadly at quality.

Still, the fact that they arrived back-to-back suggests a wider effort to clean up and refine search results overall.

A Closer Look at How Googlebot Works

Alongside the update, Google has offered a rare peek behind the curtain at how its crawler operates. Gary Illyes explained that Googlebot isn’t a standalone system—it’s part of a much larger, shared infrastructure used across multiple Google services.

That detail alone is useful, but what really caught attention was the 2 MB crawl limit.

In simple terms, when Googlebot fetches a page, it only processes the first 2 MB of the HTML response. If the page is larger, it doesn’t keep going—it just stops and treats whatever it has as the full page.

There’s no warning. No second pass. Anything beyond that cutoff doesn’t make it into the index.
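That cutoff is easy to check against your own pages. The sketch below is a minimal illustration, not an official tool: it assumes you have already fetched a page’s raw HTML (with `curl`, `urllib`, or similar) and simply measures how much of it would fall past the reported 2 MB mark.

```python
# Minimal sketch: measure how much of an HTML response falls past
# Googlebot's reported 2 MB processing cutoff. The constant reflects
# the limit as described by Google; treat it as approximate.

GOOGLEBOT_HTML_LIMIT = 2 * 1024 * 1024  # 2 MB of raw HTML

def bytes_past_limit(html: bytes) -> int:
    """Return how many bytes of this HTML response sit beyond the cutoff."""
    return max(0, len(html) - GOOGLEBOT_HTML_LIMIT)

# Example usage with HTML you have already downloaded:
# with open("homepage.html", "rb") as f:
#     overflow = bytes_past_limit(f.read())
# if overflow:
#     print(f"{overflow:,} bytes would never reach the index")
```

Anything the function reports as overflow is content Googlebot would, by this description, silently ignore.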

Why Page Size Suddenly Feels Like a Big Deal

This limit might not affect smaller, well-optimized pages. But modern websites are rarely small.

Between high-resolution images, embedded scripts, and complex layouts, page sizes have grown steadily over the years.

And that’s where the risk lies—important content could end up sitting beyond that 2 MB threshold, effectively invisible to search.

External files like CSS and JavaScript don’t count toward the same limit, but anything packed directly into the HTML does. That includes things like inline images or large chunks of code.
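Because only inline content counts, it can be useful to see how much of a page’s HTML weight comes from embedded scripts and styles rather than external references. The parser below is a rough sketch using Python’s standard library; the class name and the exact accounting are illustrative assumptions, not a replica of how Googlebot measures anything.

```python
# Rough sketch: tally the bytes of inline <script> and <style> content,
# which count toward the HTML size limit, unlike external files
# referenced via a src attribute.
from html.parser import HTMLParser

class InlineWeight(HTMLParser):
    def __init__(self):
        super().__init__()
        self._in_inline = False
        self.inline_bytes = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            # <script src="..."> loads an external file; only script
            # tags without src (and all style tags) carry inline code.
            if tag == "style" or not dict(attrs).get("src"):
                self._in_inline = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._in_inline = False

    def handle_data(self, data):
        if self._in_inline:
            self.inline_bytes += len(data.encode("utf-8"))
```

Feeding a page’s HTML to `InlineWeight().feed(...)` gives a quick sense of how much of its weight could be moved out to external files.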

It’s the kind of technical detail that’s easy to overlook, until it starts affecting visibility.

The Web Is Getting Heavier

In a related discussion, Martin Splitt pointed out that web pages today are significantly larger than they were a decade ago—nearly three times the size, in fact.

Recent data backs that up, with the average mobile homepage now sitting just above the 2 MB mark. That’s uncomfortably close to Googlebot’s limit.

There’s also an interesting tension here. Google encourages the use of structured data to improve search features, but that same markup adds to page weight. Over time, those small additions can add up.
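One concrete way to see that tension is to measure how many bytes of a page’s HTML are taken up by JSON-LD structured data. The snippet below is a quick, hypothetical check; the regex approach is deliberately crude and only meant for a rough estimate, not robust HTML parsing.

```python
# Hypothetical check: estimate the byte weight of JSON-LD structured
# data blocks (<script type="application/ld+json">) inside a page's HTML.
import re

JSON_LD = re.compile(
    r'<script[^>]*type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
    re.DOTALL | re.IGNORECASE,
)

def structured_data_bytes(html: str) -> int:
    """Total UTF-8 bytes of JSON-LD payloads embedded in the HTML."""
    return sum(len(m.encode("utf-8")) for m in JSON_LD.findall(html))
```

On a page hovering near the 2 MB mark, even a few well-intentioned markup blocks measured this way can matter.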

Google hasn’t said whether the limit will change, but it has hinted that it’s not set in stone.

Gemini Is Quietly Gaining Ground

While much of the focus remains on search, there’s another shift happening in the background—traffic from AI platforms.

According to recent data, Google Gemini has more than doubled the traffic it sends to websites in just two months. That surge appears to line up with the rollout of its latest version.

What’s striking is how quickly the gap has closed. Not long ago, Perplexity AI was well ahead; now, Gemini has overtaken it in referral traffic.

That said, ChatGPT still dominates this space, accounting for the majority of AI-driven visits. But its lead is no longer as overwhelming as it once was.

Still Small, but Growing Fast

For all the momentum, AI traffic is still a very small piece of the overall picture. It currently makes up less than a quarter of a percent of total internet traffic.

That may sound insignificant, but the growth rate tells a different story. This isn’t a static channel; it’s expanding quickly, and in ways that are still hard to predict.

For publishers and marketers, it’s becoming something to watch rather than ignore.

A Moment of Clarity with Some New Questions

What stands out this week isn’t just the updates themselves.

Google is beginning to explain more of what happens behind the scenes, from how much of a page it actually reads to why updates take time to roll out.

For people working in SEO, this kind of openness is helpful, even if it doesn’t answer everything and, in some cases, raises fresh doubts.

At the same time, the growth of Google Gemini points to a different shift that isn’t as easy to interpret yet. The numbers clearly show it’s sending more traffic, but what that really means for long-term visibility and audience reach is still taking shape.

For now, one thing is clear: search can’t be looked at only through rankings anymore. It’s tied to a much larger system that includes how websites are built, how content is structured, and how people are increasingly discovering information through AI tools.

And all of this is changing at a pace that’s hard to ignore.

Ravi Gupta
Ravi Gupta is the Founder & CEO of ravi-gupta.com, a leading SEO and digital marketing agency. With over 10…