Major Google Update: Reduced Crawl Limits for Googlebot

SEO community, pay attention! Google has just dropped a massive update to its documentation, and if you’re still relying on the “15MB rule,” your content might be at risk of not being indexed.

While everyone is busy reposting the same viral image, I’ve done the deep dive into the Wayback Machine to bring you the actual “Before vs. After.”

Googlebot old crawl limit

🔗 Proof (Archive): Wayback Machine Link – Jan 2026

Here is the full timeline and the official documentation links that show how Google’s crawling landscape just shifted.

📍 The “Before” (The 15MB Era)

Since June 2022, Google officially documented a 15MB limit for Googlebot. Many SEOs treated this as the “gold standard” for all content.

🔗 Old Documentation Link (Archive/Reference): Googlebot-15mb

📍 The “Now” (The February 2026 Split)

As of February 5, 2026, Google has reorganized its documentation, creating a massive distinction that could hurt your indexation if you aren’t careful:

Googlebot new crawl limit

  1. Googlebot (Search): Now has a 2MB limit for HTML and supported text-based files. Anything after the first 2MB is NOT forwarded for indexing.
  2. PDF Files: These are the exception, with a limit increased to 64MB.
  3. General Crawlers: The 15MB limit still exists but has been moved to “General Crawler Infrastructure” docs, applying to other Google services (not necessarily Search).
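If you want a quick sanity check, here is a minimal Python sketch (the function names and the sample page are illustrative, and it assumes the limit applies to uncompressed UTF-8 bytes, not the gzipped transfer size) that compares a page’s HTML size against the 2MB figure above:

```python
# Compare a page's uncompressed HTML size against the reported limits.
GOOGLEBOT_HTML_LIMIT = 2 * 1024 * 1024    # 2MB for HTML (Search)
PDF_LIMIT = 64 * 1024 * 1024              # 64MB for PDFs

def crawl_budget_report(html: str, limit: int = GOOGLEBOT_HTML_LIMIT):
    """Return (size_in_bytes, fits_within_limit) for an HTML document."""
    size = len(html.encode("utf-8"))
    return size, size <= limit

# Illustrative page bloated with ~3MB of inline CSS
bloated = "<html><head><style>" + "a{color:red}" * 262144 + "</style></head></html>"
size, fits = crawl_budget_report(bloated)
print(size, fits)  # this synthetic page blows past the 2MB window
```

Run it against your own rendered HTML (e.g., a saved “view source” dump) to see which templates are at risk.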

🔗 The New Documentation: Google Search Central – Googlebot

What’s Changing? Historically, Googlebot would crawl the first 15MB of an HTML or text-based file. However, as of February 2026, Google has drastically lowered this threshold for Search:

  • HTML/Text Files: The limit for Googlebot (Search) is now just 2MB. Anything beyond this point in your code will likely be ignored for indexing consideration.
  • PDF Files: Interestingly, the limit for PDFs has been set much higher at 64MB.
  • General Crawlers: The 15MB limit still exists as a baseline for other Google crawlers, but for Search, 2MB is the new ceiling.

Why This Matters for Your SEO Strategy:

  1. Indexation Risk: If your HTML is bloated with inline CSS, heavy JavaScript, or massive DOM trees, your actual content might fall outside the 2MB window.
  2. Critical Content Placement: It is now more important than ever to keep your most important keywords, metadata, and links at the very top of your HTML file.
  3. Code Efficiency: Clean, minified code isn’t just for speed anymore; it’s a necessity for full indexation.
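Point 2 is easy to verify mechanically. A small sketch (hypothetical page and marker, again assuming the cutoff counts raw UTF-8 bytes from the top of the file) that finds where a critical element lands relative to the 2MB window:

```python
# Locate a critical element's byte offset and check it against the 2MB cutoff.
LIMIT = 2 * 1024 * 1024

def offset_of(html: str, marker: str) -> int:
    """Byte offset of `marker` in the UTF-8 encoding of `html` (-1 if absent)."""
    return html.encode("utf-8").find(marker.encode("utf-8"))

padding = "/* bloat */" * 200_000  # ~2.2MB of inline CSS before the content
page = ("<html><head><style>" + padding + "</style></head>"
        "<body><h1>Key content</h1></body></html>")

pos = offset_of(page, "<h1>Key content</h1>")
print(pos, pos <= LIMIT)  # here the heading sits past the 2MB cutoff
```

The same bloat placed after the content would be harmless, which is exactly why placement matters.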

My Take: Google is pushing for a leaner, faster web. This change forces developers and SEOs to stop treating HTML files like “catch-all” buckets for code.

Are your pages exceeding the 2MB limit? Now is the time to check your “Uncompressed Data” sizes in Search Console! 🔍
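You can also approximate that number locally with nothing but the Python standard library (the URL below is a placeholder; the helper undoes gzip transfer encoding so you measure uncompressed bytes, which is what the limit applies to):

```python
# Measure a fetched page's uncompressed body size.
import gzip
import urllib.request

def uncompressed_size(body: bytes, content_encoding: str = "") -> int:
    """Size of a response body after undoing gzip transfer encoding."""
    if content_encoding == "gzip":
        body = gzip.decompress(body)
    return len(body)

def page_size(url: str) -> int:
    """Fetch `url` and return its uncompressed body size in bytes."""
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        return uncompressed_size(resp.read(),
                                 resp.headers.get("Content-Encoding", ""))

# e.g. page_size("https://example.com") — compare against 2 * 1024 * 1024
```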