SEOIntel Weekly News Round-up (First Week of February 2026)

Marie Aquino
February 6, 2026

This week’s updates point to a quiet but important shift in how Google surfaces and evaluates content — especially beyond traditional search results. With the rollout of the first-ever Discover Core Update, Google is signaling that Discover is no longer just an add-on feed, but a system with its own evolving quality standards. Combined with new insights into crawling challenges and Google’s framing of AI search as an expansion rather than a replacement, this week’s news offers a clearer look at where visibility and discovery are heading.

Google Rolls Out February 2026 Discover Core Update — A First of Its Kind

Google has officially launched the February 2026 Discover Core Update, marking the first core algorithm change focused solely on Google Discover — the personalized content feed that users see in the Google app and on mobile devices. The update kicked off on February 5, 2026, and is currently rolling out to English-language users in the United States, with plans to expand globally over the coming weeks.

What Is a Discover Core Update?

Unlike traditional Search core updates that broadly affect organic ranking signals for query results, this update is tailored specifically to Discover’s recommendation systems — the machine-learning systems that decide which content to show each user without an explicit search query.

Google describes core updates as significant algorithmic improvements designed to better surface quality content. Core updates don’t penalize websites but rather reassess how content quality, relevance, and user value are interpreted across systems.

Key Aims of the Update

Industry reporting and expert commentary suggest the February 2026 Discover Core Update centers on three major priorities:

  • Boosting locally relevant content — content from sites based in a user’s country or region may be more prominent in feeds.
  • Demoting sensational and clickbait material — headlines and posts designed primarily to lure clicks may be less favored.
  • Rewarding topical expertise and depth — original, in-depth reporting and authoritatively presented content are more likely to resonate with Discover’s systems.

This focus aligns with broader quality signals like E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness) already emphasized in Search quality documentation — suggesting Discover is increasingly governed by similar foundational criteria.

Why This Matters

The Discover feed can be a significant source of referral traffic for publishers, especially for news and timely content. A dedicated core update signals that Google considers Discover a distinct and strategically important platform — not just an offshoot of Search.

Because the update is Discover-specific, ranking changes here may not directly correlate with traditional Search visibility, meaning sites can see fluctuations in Discover traffic independent of Search performance.

What Publishers and SEOs Should Watch

Even in its early rollout phase, this update highlights several key takeaways for content teams:

  • Local relevance matters — optimization for regional audiences, location-specific reporting, and localized content hubs may perform better in Discover feeds.
  • Avoid clickbait tactics — simplified “shock” headlines or sensational summaries may not fare well under updated quality signals.
  • Deep, original content wins — developing authoritative, well-researched topic clusters supports not just Search ranking but feed relevance.

The rollout is expected to take up to two weeks in the U.S. before expanding globally, so publishers with significant Discover traffic should monitor their Discover reports in Search Console and analytics closely over the coming days.
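For teams that track Discover performance programmatically, the Search Console Search Analytics API accepts a `type` field that can be set to `discover`. The sketch below only builds the request body; the site URL, date range, and the `build_discover_query` helper name are illustrative, and an authenticated `google-api-python-client` service would be needed to actually run the query.

```python
# Sketch: construct a Search Console Search Analytics request body scoped to
# Discover traffic. The "type" field accepts "discover" alongside "web" etc.
def build_discover_query(start_date: str, end_date: str, row_limit: int = 250) -> dict:
    """Return a request body for searchanalytics().query() covering Discover."""
    return {
        "startDate": start_date,   # ISO dates, e.g. "2026-02-01"
        "endDate": end_date,
        "type": "discover",        # restrict the report to the Discover feed
        "dimensions": ["page"],    # per-URL clicks and impressions
        "rowLimit": row_limit,
    }

# With an authenticated service object, the body would be passed as:
# service.searchanalytics().query(siteUrl=SITE_URL, body=body).execute()
body = build_discover_query("2026-02-01", "2026-02-14")
```

Comparing a window before and after February 5 against the same query body makes it easier to separate Discover-specific swings from ordinary Search fluctuations.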


What Google Learned From a Year of Crawling the Web

In a recent Search Off the Record episode, Google’s Martin Splitt and Gary Illyes walked through the 2025 Year-End crawling report, highlighting the biggest obstacles Googlebot faced last year. What they describe isn’t abstract SEO theory — it’s real-world crawling behavior that impacts how efficiently Google can discover and index content across the web.

Although the full episode goes into deep technical nuance, the core takeaways point to a handful of persistent issues that make crawling harder for Googlebot — and tougher for site owners who want consistent visibility.

What Google Found Was Tripping Up Crawlers

According to the data shared in the report and summarized by industry coverage:

  1. Faceted Navigation (≈50%)
    E-commerce and filter-heavy sites create huge sets of URLs when users can sort or filter products (price, size, color, etc.). These combinations generate vast URL spaces that Googlebot often tries to crawl exhaustively — even when many of these pages aren’t useful to searchers.
  2. Action Parameters (≈25%)
    Parameters that trigger actions — like adding something to a cart — don’t change the content in ways Googlebot needs to index, but they still produce unique URLs. These often stem from plugins or default CMS behaviors, multiplying crawl paths without adding value.
  3. Irrelevant Parameters (≈10%)
    Things like session IDs, UTM tags, or other tracking strings don’t meaningfully alter a page’s content, but the crawler can’t always tell that up front. So Googlebot may crawl repetitively to test whether these strings matter.
  4. WordPress Plugins and Widgets (≈5%)
    Certain open-source plugins can introduce unusual structures or parameters that expand the crawling surface without meaningful content changes — another drain on crawl efficiency.
  5. Other “Weird Stuff” (≈2%)
    Rare technical misconfigurations — like double-encoded URLs — also showed up as cases where Googlebot struggles to determine canonical content or whether a URL even renders properly.

These findings aren’t just numbers: they reflect real patterns where crawlers spend unnecessary time and resources, potentially slowing down discovery of your core pages and increasing server load.

Why Crawl Efficiency Still Matters

Even as Search evolves with AI and machine learning, crawl efficiency remains foundational. If Googlebot can’t reliably reach and interpret your site’s valuable content, that content may never get the attention it deserves in Search. Issues like runaway crawl loops, redundant parameters, and massive parameter-generated URL surfaces can waste bot resources and consume bandwidth you’d rather reserve for your strongest pages.

For SEOs and site owners, these insights reaffirm long-standing best practices:

  • Use canonical URLs and parameter handling rules where appropriate
  • Limit faceted navigation URLs that don’t serve unique content
  • Audit plugins and features that generate needless URL variations
  • Monitor crawling behavior via logs and Google Search Console
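The last point — monitoring crawling via logs — can start very simply. The sketch below tallies Googlebot requests to parameterized URLs from a combined-format access log, which surfaces the parameter-heavy pages eating crawl activity. The log format and the user-agent substring check are simplifying assumptions (production bot verification should use reverse DNS or Google’s published IP ranges).

```python
import re
from collections import Counter

# Matches the request, status, and user-agent portions of a combined-format
# access log line, e.g.:
# 1.2.3.4 - - [05/Feb/2026:10:00:00 +0000] "GET /shoes?color=red HTTP/1.1" 200 512 "-" "Googlebot/2.1"
LOG_RE = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def googlebot_param_hits(lines):
    """Count Googlebot requests whose URL carries a query string, keyed by path."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.search(line)
        if m and "Googlebot" in m.group("ua") and "?" in m.group("path"):
            # Tally by path-without-query to spot parameter-heavy pages.
            counts[m.group("path").split("?", 1)[0]] += 1
    return counts
```

Paths that dominate this tally are good candidates for canonical tags, robots.txt disallow rules, or rethinking how many filter combinations get linked at all.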

In a year where crawling was anything but simple, these challenges highlight continued opportunities to make sites easier to crawl, more predictable to index, and more rewarding for users and search systems alike.


Google Says AI Search Is Entering an “Expansionary Moment” — Not a Collapse of Usage

Google executives framed the current phase of AI-powered search as an “expansionary moment”, where artificial intelligence isn’t replacing traditional search activity but adding to it, driving richer, more engaged interactions across more modalities and query styles. This concept emerged from commentary tied to Alphabet’s latest earnings and public statements about how AI features are reshaping search behavior and usage patterns.

Executives like Sundar Pichai emphasized on Google’s Q4 2025 earnings call that AI-driven experiences are boosting overall search usage, rather than cannibalizing it. Pichai noted that as users engage with generative features, they tend to ask more questions and explore more deeply — a signal that AI features are expanding the volume and depth of interactions within Search.

This idea of expansion goes against early predictions that AI responses might reduce search volumes by answering queries directly without return visits. Instead, Google reports higher engagement and more complex query patterns, driven by longer sessions, increased follow-ups, and broader use of voice and image search — all infused with AI.

Contextually, this reflects Google’s ongoing integration of AI Overviews and AI Mode into Search. These features let users ask conversational questions and keep exploring related topics within the same interaction — effectively stretching search sessions instead of shrinking them.

Industry analysts note that richer AI interactions also influence monetization and revenue dynamics. AI-powered queries tend to be longer and more complex, opening up new surfaces for ads and potential commerce integrations like pilot “Direct Offers” in AI Mode.

Despite some concerns from publishers about zero-click results or reduced click-through to traditional links, executives — including Google’s Robby Stein — have explicitly described this as an expansion rather than contraction of how search functions, linking deeper AI engagement with increased overall activity.

In practical terms, “expansion” means:

  • Users are asking more follow-up questions.
  • AI experiences are lengthening sessions.
  • Voice and multimodal queries (like images) are rising.
  • Advertisers and revenue channels are evolving alongside these experiences.

For SEOs and publishers, this frame matters: it suggests that search visibility still exists within a richer engagement ecosystem, but the ways people interact with content are broadening. Visibility may now be measured not just in clicks, but in engagement depth, session length, and presence inside AI-guided pathways.


Taken together, these updates reinforce that search visibility is no longer confined to classic rankings alone. Discover is being refined with its own quality signals, crawl efficiency remains foundational, and AI-driven experiences are reshaping how users interact with information. For site owners, this means paying closer attention to how content is discovered, crawled, and recommended — not just how it ranks. Understanding these layers will be key as Google continues to diversify how content reaches users.