
AI in search is no longer just about better answers—it’s reshaping how content is discovered, who controls visibility, and where traffic actually goes. This week, Google is pushing deeper into personalization with “Personal Intelligence,” while also revealing how its crawling systems truly work behind the scenes. At the same time, new data highlights a growing reality: search traffic is shifting, not disappearing—and AI referrals are not yet growing fast enough to fill the gap.
Google is taking a major step toward more personalized AI experiences with the expansion of Personal Intelligence — a feature that allows its AI systems to use a user’s own data and context to deliver more tailored responses. The rollout brings this capability beyond limited testing into broader availability across AI Mode in Search, the Gemini app, and Gemini in Chrome.
At its core, this update signals a shift from search as a generic information tool to something closer to a personalized assistant that understands your context, preferences, and history.
What “Personal Intelligence” Actually Does
Personal Intelligence allows Google’s AI (powered by Gemini) to connect signals from a user’s Google ecosystem — including apps like Gmail, Google Photos, Search, and YouTube — to generate responses that are more relevant to the individual.
Instead of treating every query as a standalone request, the system can factor in personal context such as past activity and stated preferences.
For example, if you search for travel ideas, the system could suggest destinations aligned with your past trips, preferences, or even photos you’ve taken — reducing the need to repeatedly explain context.
Expanding Beyond Early Access
Initially, Personal Intelligence was limited to AI Pro and AI Ultra subscribers as an opt-in feature. With this expansion, Google is bringing it to more users in the U.S., including broader access within AI Mode and Gemini experiences.
This wider rollout reflects Google’s push to make AI not just smarter, but more useful in everyday tasks — from planning trips to shopping decisions and project organization.
Where It Shows Up
Google is embedding Personal Intelligence across multiple touchpoints: AI Mode in Search, the Gemini app, and Gemini in Chrome.
This cross-product integration means personalization is no longer tied to a single tool — it’s becoming a layer across Google’s entire ecosystem.
Privacy and Control: A Core Part of the Rollout
Given the sensitivity of using personal data, Google emphasizes that Personal Intelligence is opt-in and user-controlled.
Users can decide whether to enable the feature and which data sources it may draw on.
Google also states that personal data from sources like Gmail or Photos is not used to train the underlying AI models, but rather to provide contextual responses within the user’s session.
Why This Matters: Search Is Becoming Context-Aware
This expansion reflects a broader shift in how search works:
1. From queries to context
Search is moving beyond keyword matching toward understanding who is asking and why — not just what is being asked.
2. From static results to adaptive responses
Instead of showing the same results to everyone, AI can generate responses tailored to each user’s situation.
3. From information retrieval to decision support
Personal Intelligence positions Google not just as a source of answers, but as a system that helps users make decisions faster with less friction.
Implications for SEO and the Web
For SEOs and publishers, this evolution introduces new considerations.
At the same time, this raises ongoing questions around data usage, personalization boundaries, and potential filter effects, as AI systems increasingly shape what users see based on their own history.
The Bigger Picture
Google’s expansion of Personal Intelligence is part of a larger transition from search engines to intelligent systems.
Instead of simply organizing the web, Google is now working toward connecting the web with the user’s personal context — creating an experience that feels less like searching and more like interacting with an assistant that already understands you.
As this rolls out more broadly, it could redefine how users discover information — and how content gets surfaced in an increasingly personalized search environment.
In a recent episode of Search Off the Record, Google’s Martin Splitt and Gary Illyes pull back the curtain on one of the most misunderstood parts of search: Google’s crawling infrastructure. While many developers and SEOs think of “Googlebot” as a single crawler, the reality is far more complex — and much more interesting.
This episode breaks down how crawling actually works inside Google, why the idea of a single “Googlebot” is misleading, and how Google manages crawling at scale without overwhelming websites. If you’ve ever tried to understand crawl behavior from logs or Search Console, this discussion provides a much clearer mental model.
“Googlebot” Is Not a Single Bot
One of the biggest takeaways from the episode is that Googlebot isn’t a single program.
It’s not something like a standalone executable (e.g., “googlebot.exe”) running independently across the web. Instead, Googlebot is better understood as a collection of crawling systems operating within a much larger infrastructure.
Google’s crawling is built more like a centralized platform or service (CaaS — Crawling as a Service) used across multiple Google products — not just Search. This means different systems can request crawling, but the execution is handled by a shared infrastructure.
Crawling Is Centrally Controlled
Another key point is that crawling decisions are not made independently by each “bot.” Instead, they are centrally coordinated.
Google uses a unified system to decide what gets crawled, when, and at what rate.
This centralized control allows Google to manage crawling at web scale while maintaining consistency across its products.
“Don’t Break the Internet”: How Google Protects Websites
A major theme in the episode is that Google is extremely careful not to overload websites.
Gary and Martin repeatedly emphasize that one of Google’s core principles is essentially:
“Don’t break the internet.”
To achieve this, Google uses several safeguards:
1. Throttling Crawl Rate
Google dynamically adjusts how fast it crawls a site based on how the server responds. If a site slows down or returns errors, Googlebot will automatically reduce its crawl rate.
2. Handling Server Errors (e.g., 503s)
If a server responds with a 503 (Service Unavailable), Google interprets this as a signal to back off. This allows site owners to temporarily reduce crawl pressure without harming long-term indexing.
3. Adaptive Behavior
Crawling is not fixed — it adapts in real time based on signals such as server response times and error rates.
This ensures Googlebot behaves more like a polite visitor than an aggressive scraper.
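The adaptive behavior these safeguards describe can be sketched as a simple client-side policy: back off when the server signals stress, and recover toward a baseline when it responds healthily. This is an illustrative model of the principle only, not Google’s actual algorithm; the delays, thresholds, and multipliers are assumptions.

```python
# Illustrative sketch of an adaptive crawl-rate policy. Not Google's
# actual algorithm; delays, thresholds, and multipliers are assumptions.

class CrawlThrottle:
    """Adjusts the delay between requests based on server responses."""

    def __init__(self, base_delay=1.0, min_delay=0.5, max_delay=60.0):
        self.delay = base_delay
        self.min_delay = min_delay
        self.max_delay = max_delay

    def record_response(self, status_code, response_seconds):
        if status_code >= 500:
            # 5xx (including 503) means the server is struggling: back off hard.
            self.delay = min(self.delay * 2, self.max_delay)
        elif response_seconds > 2.0:
            # Slow responses also signal load: ease off gently.
            self.delay = min(self.delay * 1.5, self.max_delay)
        else:
            # Healthy response: recover toward the minimum delay.
            self.delay = max(self.delay * 0.9, self.min_delay)
        return self.delay
```

A crawler using this policy would sleep `delay` seconds between requests to the same host, so a run of 503s quickly multiplies the gap between visits.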
Multiple Systems Request Crawling
Because crawling is centralized, multiple Google systems, not just Search, can request URLs to be crawled.
These systems feed into the same infrastructure, which then prioritizes and schedules crawling efficiently.
This helps explain why crawl patterns can sometimes feel unpredictable — they are influenced by multiple signals and systems, not just Search alone.
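When crawl patterns look erratic, server access logs are the ground truth. Below is a minimal sketch that tallies Googlebot requests per day from combined-format log lines. Matching the user-agent string alone is spoofable; Google’s documentation recommends a reverse-DNS lookup to verify genuine Googlebot traffic, which is omitted here for brevity.

```python
import re
from collections import Counter

# Tally requests whose user-agent mentions Googlebot, grouped by day.
# User-agent matching is spoofable; verify with reverse DNS for certainty.
DATE_PATTERN = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

def googlebot_hits_per_day(log_lines):
    counts = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        match = DATE_PATTERN.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts
```

Plotting these daily counts over a few weeks makes throttling and back-off episodes visible, rather than leaving them to guesswork.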
Crawl Behavior Is About Efficiency, Not Exhaustion
Another important clarification is that Google does not try to crawl everything equally.
Instead, it prioritizes pages based on how often they change and how much demand they attract.
This aligns with what we see in practice: not all pages are crawled at the same frequency, and some may rarely be revisited if they don’t change or attract demand.
Key Takeaways for SEOs and Developers
This episode reinforces several practical insights:
1. Think Beyond “Googlebot” as a Single Entity
Crawling is a centrally coordinated service spanning many systems, not a single bot hitting your site.
2. Crawl Rate Reflects Site Health
If crawling slows down, it may be a reaction to server errors, slow response times, or falling demand for your content.
3. Server Responses Matter
Returning proper status codes (like 503 when needed) helps Google adjust behavior correctly.
4. Crawl Budget Is Systemic
Crawl behavior is influenced by signals from across Google’s products and systems, not by Search alone.
It’s not just about how many URLs you have — it’s about how efficiently your site can be crawled.
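Point 3 above is actionable today: a site going into maintenance can return 503 with a Retry-After header so crawlers treat the outage as temporary, and Google documents 503 as a signal to back off. The handler below is a minimal WSGI sketch with an illustrative retry value, not a recommendation for any particular framework.

```python
# Minimal WSGI app: answer every request with 503 + Retry-After during
# maintenance, so crawlers back off instead of indexing error pages.

def maintenance_app(environ, start_response):
    body = b"Down for maintenance, please retry later."
    start_response("503 Service Unavailable", [
        ("Content-Type", "text/plain"),
        ("Retry-After", "3600"),  # illustrative: ask clients to retry in an hour
        ("Content-Length", str(len(body))),
    ])
    return [body]
```

This can be served behind any WSGI server (for example `wsgiref.simple_server`) while the real application is offline.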
Why This Podcast Matters
For many in SEO, crawling is often treated as a black box. This episode helps demystify it.
Understanding this shifts how you think about crawl optimization. Instead of trying to “force” crawling, the focus should be on making your site easy, fast, and valuable to crawl.
The Bottom Line
Google’s crawling infrastructure is far more sophisticated than the idea of a single “Googlebot.” It’s a coordinated, adaptive system designed to balance discovery, efficiency, and the health of the web itself.
For SEOs and developers, the takeaway is simple:
optimize for clarity, performance, and value — and the crawling systems will follow.
In a wide-ranging interview on the ACCESS podcast, Google Search chief Liz Reid offers a rare look into how Google is thinking about Search during the AI boom. The conversation covers much more than AI Overviews. It touches on how Google sees the future of Search, the difference between Search and Gemini, whether AI agents could become the main users of the web, how Google is dealing with AI-generated slop, and what personalization may look like as AI gets more context-aware.
For anyone trying to understand where Google is headed, this interview gives a useful picture of how the company sees the current moment: not as the end of Search, but as a major expansion of what Search can do.
Liz Reid’s Background Matters
One reason the interview is worth paying attention to is who is speaking. Liz Reid is not a new outside hire brought in to manage a crisis. She is a longtime Googler, with more than two decades at the company, and previously spent many years working on Google Maps before moving to Search.
That background shapes how she talks about product change. She frames Search the same way she framed Maps: as a tool people rely on in everyday life, not just a tech product. Her view is that AI should not be added for novelty. It should help people do things faster, with less friction, and with more confidence.
Google Does Not See AI as the End of Search
One of the most important themes in the interview is that Google does not believe AI is replacing Search in a zero-sum way. Reid argues that the space is actually getting bigger. Her point is that when better tools lower the effort needed to ask a question, people do not just shift one question from one product to another. They often ask more questions overall.
That is a meaningful distinction. Rather than seeing ChatGPT, Claude, Gemini, and Search as one-winner-takes-all products, she suggests the market is still fluid. People are using multiple tools for different reasons, and those habits are still evolving.
In Google’s view, AI is not shrinking curiosity. It is increasing the number of tasks people are willing to offload to software.
AI Overviews Changed User Behavior Faster Than Google Expected
Reid says one of the more surprising things for Google was how quickly users adapted to AI Overviews. She notes that Google usually expects some learning curve whenever it changes Search, especially because people are sensitive to changes in familiar interfaces. But according to her, users picked up AI Overviews faster than expected and started using Search more as a result.
That is an important point because it suggests Google sees AI Overviews not just as a defensive response to ChatGPT, but as a product change that is already influencing behavior at scale.
She also stresses that Google has tried to introduce these changes gradually. Instead of completely replacing the classic Search experience, Google has layered AI features into Search in stages, first through Search Generative Experience testing, then AI Overviews, then AI Mode. That gradual rollout is part of Google’s effort to avoid disrupting a product people depend on every day.
Search and Gemini Are Not the Same Thing
One of the clearest parts of the interview is Reid’s explanation of the difference between Google Search and the Gemini app.
Her view is that the two products share some underlying AI capabilities, but they have different goals.
That means even if the two products sometimes produce similar answers, Google still sees them as serving different use cases. Search is supposed to help users discover information from the broader web, while Gemini leans more toward helping users complete tasks and generate outputs.
At the same time, Reid is honest that the boundaries are not fixed. She says some parts of the two products are converging, while other parts are diverging. Google does not yet know whether they will eventually merge more closely, remain separate, or evolve into something else entirely.
Classic Ranking Still Matters in an AI Search World
Although this interview is more product-focused than technical, one point comes through clearly: Google does not see AI answers as disconnected from traditional Search infrastructure. Reid repeatedly emphasizes reliability, speed, quality, and surfacing the right information. That reflects a broader Google view that AI interfaces still depend on strong retrieval, ranking, and product discipline underneath.
She also points out that Search operates with very tight performance expectations. In her words, users are sensitive even to tiny speed differences. That means Google cannot simply add large language model features if they are too slow or too unreliable. This helps explain why some ideas may have existed internally for years but only became viable once the technology got fast and good enough.
Many “New” AI Ideas Were Not New at All
Another interesting part of the interview is Reid’s point that many of the ideas people now associate with the AI era were already being explored inside Google years ago. The difference is that they often did not work well enough then.
She cites examples of features Google had explored internally years before the technology made them viable.
Her broader point is that recent advances have not created the desire for these features from scratch. Instead, they have made long-imagined ideas finally practical.
Google Thinks Agents Will Be Part of the Web’s Future
The interview also gets into the possibility that agents could become major users of the web. Reid does think there will be a future where agents do a lot of interacting online, not just humans. But she does not believe the internet will become fully agent-to-agent.
Her reasoning is simple: people still want to hear from other people directly. They do not always want an intermediary summarizing or filtering everything for them. So while agents may perform more tasks, gather information, and interact with systems on our behalf, Reid expects a mixed environment rather than a fully automated one.
This matters because it suggests Google is thinking not just about how humans use Search, but how Search might work in a future where software agents also retrieve and process information at scale.
Personalization Is Becoming More Important
One of the more revealing sections of the interview concerns personal intelligence and AI personalization. Reid argues that one of the most important factors in whether AI feels truly useful is whether it “gets you right.”
She describes a future where Google can use the context people deliberately opt into sharing to help solve problems faster. Her examples are practical rather than flashy: planning activities for kids, understanding a user’s preferences, or helping someone make use of benefits or locations they might not think to ask about.
Importantly, she says Google’s approach here is based on consent. Users need to opt in, and Google does not want to assume everyone wants data sources blended together automatically. But the company clearly sees personalized context as one of the ways AI can move from generic answer machine to something more genuinely helpful.
Google Is Worried About AI Slop, But Sees It as an Extension of an Old Problem
Another strong section focuses on AI-generated slop. Reid’s view is nuanced. She does not treat AI use itself as the problem. Instead, she separates using AI as a tool from using AI to mass-produce low-value content.
She points out that Google has been fighting low-quality, repetitive, and manipulative content for a long time. AI has made it easier to create that kind of material faster, but it did not invent the problem. From Google’s perspective, this is an escalation of an existing spam challenge, not a totally new category.
She also argues that AI can help create better content, not just worse content. The real issue is whether the result is genuinely useful, original, and valuable.
That is an important distinction for publishers and creators. Google does not appear to be taking a blanket anti-AI position. It is still focused on quality and usefulness, even if the production process changes.
Google Sees Opportunity in Audio, Video, and Multimodal Understanding
The interview also shows how Google is thinking beyond text. Reid says advances in multimodal AI are making it easier for Google to understand not just transcribed words, but the content and style of audio and video more deeply.
That opens up new possibilities for indexing and surfacing material that has historically been harder to work with, such as podcasts, videos, and other formats that are not purely web-page based. It also supports Google’s effort to keep Search useful even as more valuable content lives behind different media formats or paywalls.
In other words, Google does not seem to see the future of Search as limited to blue links and text documents. It is clearly thinking about how Search evolves in a more multimodal web.
Google Wants to Help Users Reach the Sources They Actually Value
Reid also touches on a theme that is easy to miss but potentially important: Google wants to do more to strengthen the relationship between users and the sources they already trust.
She mentions ideas for reducing the friction between users and the sources they already rely on.
That suggests Google is aware of how much friction exists today between Search, subscriptions, paywalls, and user trust. It also hints at a more personalized discovery layer where not every source is treated identically for every user.
Google Is Moving Faster Because the Technology Finally Works
A recurring undercurrent in the interview is speed. Reid agrees that Google is moving faster now, but frames that acceleration less as panic and more as the result of technology becoming usable.
Her point is that there have been many moments in Google’s history where teams had good ideas that were too slow, too unreliable, or too hard to scale. What has changed now is that more of those ideas have crossed the threshold into actually working well enough to ship.
That helps explain why Google feels more aggressive and experimental right now. It is not only reacting to competition. It is also responding to a moment where more things are suddenly possible.
The Big Takeaway
The most important takeaway from this interview is that Google does not see AI as something happening outside Search. It sees AI as something that is expanding Search, challenging Search, and forcing Search to evolve all at once.
Liz Reid’s view is not that the open web disappears or that assistants replace everything. It is that the internet is becoming more dynamic, more multimodal, more personalized, and more agent-driven — and Google wants Search to remain the central way people navigate that complexity.
For publishers, SEOs, creators, and marketers, the message is clear: Google is still deeply committed to connecting users with information from the web, but the format of that connection is changing. Search is becoming less about a list of links and more about a layered system of answers, context, preferences, media, and AI assistance.
Readers do not need to listen to the full podcast to understand the main point: Google is not defending the old version of Search. It is trying to redefine Search before someone else does.
Google says it is developing additional controls that would let site owners opt out of generative AI features in Search, a notable shift as pressure from regulators and publishers continues to grow. The development was disclosed in Google’s response to the UK Competition and Markets Authority (CMA), where the company said its goal is to ensure website owners have “the right controls” over how their content is used.
This matters because, until now, publishers have argued that Google’s existing controls are not granular enough. Site owners can limit crawling or snippets in broad ways, and Google also offers Google-Extended for some AI training use cases, but those options do not provide a clean way to stay visible in traditional Search while excluding content from AI Overviews or other generative Search experiences.
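For reference, the controls that exist today look like this. The Google-Extended robots.txt token and the nosnippet robots meta tag are real, documented directives, but exactly how each interacts with specific generative features is defined by Google and subject to change, so treat this as a sketch of the current toolbox rather than a recipe.

```txt
# robots.txt: opt out of content use for Gemini training/grounding via
# the Google-Extended token (does not stop Googlebot crawling for Search)
User-agent: Google-Extended
Disallow: /
```

```html
<!-- Per-page: suppress snippets, which also limits what generative
     features can quote, at the cost of losing snippets in classic results -->
<meta name="robots" content="nosnippet">
```

Neither lever cleanly separates classic Search visibility from generative features, which is precisely the gap publishers have been pressing Google to close.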
Why This Is Happening Now
The immediate trigger is the UK’s regulatory push. Earlier this year, the CMA proposed measures that would give publishers more control over whether their content can be used in Google’s AI-powered Search features, including AI Overviews. Reuters reported that Google’s new position is meant in part to address those competition concerns.
At the same time, publisher pressure has been building more broadly in Europe. Reuters previously reported that European publishers filed antitrust complaints over Google’s AI Overviews, arguing that the current setup forces them into an unfair tradeoff: either allow their content to be used in AI-generated experiences or risk losing visibility in Search.
What Google Appears to Be Working On
Google has not yet published the technical details of how this opt-out would work, but the language matters. The company is not describing a broad deindexing tool. Instead, it is pointing toward further updates to controls so that sites can specifically opt out of generative AI features in Search. That suggests Google is at least exploring a more granular system than the current all-or-nothing choices publishers have criticized.
If implemented as publishers hope, such a control could potentially let a site remain indexed and fully visible in traditional Search while keeping its content out of AI Overviews and other generative experiences.
That would be a meaningful change from the current environment, where limiting Google’s use of content often carries broader visibility consequences. This is an inference based on Google’s stated direction and the nature of publisher complaints, not a finalized product specification.
Why Publishers Care So Much
For publishers, the issue is not just attribution. It is traffic, revenue, and control. AI-generated summaries can answer a user’s question directly on Google, reducing the need to click through to the originating site. Critics say that puts publishers in a difficult position: their content may help power the answer, but they may lose the visit. Reuters has reported that publishers in both the UK and EU see this as a serious competition issue, especially when no practical opt-out exists that preserves normal Search visibility.
That concern has only intensified as Google continues expanding AI-driven Search experiences. Google has said AI Overviews and AI Mode are part of the evolution of Search, and that users are asking more complex questions inside these interfaces. For publishers, that makes the control question more urgent, not less.
What This Means for SEO
For SEOs and site owners, this is one of the more important publisher-control stories to watch right now.
If Google introduces a true opt-out for generative Search features, it could create a new strategic decision: whether the clicks protected by opting out outweigh the visibility gained from appearing inside AI results.
That decision would likely vary by business model. A publisher dependent on ad impressions may weigh the tradeoff differently than a brand focused on awareness or a site that benefits from being cited inside AI results. This is a strategic implication, not something Google has formally stated. It follows from the potential existence of a more granular control.
The Bigger Picture
Google’s acknowledgement that it is developing an opt-out is significant because it shows the company is no longer just defending its current setup. It is publicly signaling that publisher control over generative Search usage may need to become more specific and more flexible.
Nothing has launched yet, and many details remain unknown. But the direction is clear: as AI becomes more deeply embedded into Search, the old content-control tools may no longer be enough. The next phase of Search may depend not only on how Google generates answers, but on how much choice publishers are given over whether their content helps create them.
A new study reported by Axios, based on Chartbeat data across thousands of publisher sites, reveals a major shift in how users are finding content online. The findings highlight a sharp decline in traditional search traffic — especially for smaller publishers — and show that AI chatbots are not yet making up for that loss.
Together, the data paints a clear picture: traffic isn’t disappearing, but the pathways to reach content are changing quickly.
Key Finding #1: Search Traffic Is Dropping — Especially for Small Publishers
The most striking takeaway from the Chartbeat data is the scale of the decline in search referrals, with smaller publishers seeing the steepest drops.
This shows that the impact of shifting search behavior is not evenly distributed. Smaller sites — many of which rely heavily on SEO — are being hit the hardest, while larger brands are more insulated due to direct traffic and stronger audience relationships.
Key Finding #2: AI Chatbots Are Growing — But Still Tiny in Volume
Despite rapid growth in tools like ChatGPT, the study shows that AI-driven traffic is still minimal compared to traditional search.
This creates a gap: while search traffic is declining, the replacement channels are not yet large enough to compensate.
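Publishers can approximate this search-versus-AI split in their own analytics by classifying referrer hostnames. A minimal sketch follows; the hostname lists are illustrative assumptions rather than a complete or authoritative taxonomy, and real AI referral measurement is messier (many assistant visits arrive with no referrer at all).

```python
from urllib.parse import urlparse

# Classify a referrer URL as search, AI assistant, or other.
# Hostname lists are illustrative assumptions, not a standard taxonomy.
SEARCH_HOSTS = {"www.google.com", "www.bing.com", "duckduckgo.com"}
AI_HOSTS = {"chatgpt.com", "gemini.google.com", "www.perplexity.ai"}

def classify_referrer(referrer_url):
    host = urlparse(referrer_url).hostname or ""
    if host in SEARCH_HOSTS:
        return "search"
    if host in AI_HOSTS:
        return "ai"
    return "other"
```

Running this over a month of referrer data makes the gap described above concrete for a specific site, instead of relying on industry averages.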
Key Finding #3: Overall Traffic Isn’t Collapsing — It’s Shifting
One important nuance in the data is that total web traffic remains relatively stable, even as search declines.
This suggests users are not consuming less content; they are simply reaching it through different channels.
Publishers with diversified traffic sources — such as direct visits, email, apps, or social — are better positioned to absorb the shift.
Key Finding #4: Larger Publishers Are Adapting Better
The study highlights a structural advantage for larger publishers: stronger brands, more direct traffic, and deeper audience relationships.
These factors help larger organizations maintain stability, even as search becomes less dominant.
In contrast, smaller publishers that rely primarily on SEO face a more difficult adjustment.
What’s Driving the Decline?
While the study focuses on traffic data, the broader context points to contributing factors such as AI Overviews answering more queries directly and users increasingly getting what they need without clicking.
These trends align with broader industry observations that search is becoming less click-driven and more answer-driven.
What This Means for SEO and Content Strategy
The implications of this study are significant:
1. SEO Alone Is No Longer Enough
Relying purely on search traffic is becoming riskier, especially for smaller sites.
2. Diversification Is Critical
Publishers need to invest in channels they control: direct visits, email, apps, and social.
3. AI Traffic Is High Intent — But Low Volume
Even though AI referrals are small, they tend to be more engaged users, suggesting quality over quantity.
4. Discovery Is Fragmenting
Traffic is no longer dominated by one channel — it’s becoming multi-source and distributed.
The Bigger Picture
The Chartbeat study reinforces a key shift now underway in the search ecosystem.
For publishers and SEO professionals, the takeaway is clear:
the future isn’t about optimizing for a single source of traffic — it’s about adapting to a multi-channel, AI-influenced discovery landscape.
In this environment, visibility depends not just on ranking — but on where and how users encounter content in the first place.
Breaking News Thrives in the Age of AI — But Everything Else Is Declining
A new study from Define Media Group reveals a striking shift in how content performs in the age of AI search. While AI features like Google’s AI Overviews are reducing overall organic traffic, breaking news content is not only surviving — it’s growing rapidly, driven largely by Google Discover.
The findings highlight a growing divide in content performance: time-sensitive news is gaining visibility, while evergreen content is steadily losing ground.
Key Finding #1: AI Overviews Are Reducing Organic Search Traffic
Across Define Media Group’s portfolio, the introduction and expansion of AI Overviews has had a measurable negative impact on organic search traffic.
This reinforces a broader trend: as Google surfaces more answers directly in search results, fewer users are clicking through to websites.
Key Finding #2: Breaking News Is Up +103%
Despite the overall decline, one category stands out: breaking news traffic is up 103%.
Breaking news benefits from its real-time nature, which makes it difficult for AI systems to summarize quickly and reliably. Because of this, Google continues to prioritize it in formats like the Top Stories carousel, which still drives clicks to publishers.
Key Finding #3: Discover Is Now a Major Traffic Driver
One of the most important shifts in the study is the rise of Google Discover as a primary traffic driver, with much of the breaking-news growth arriving through Discover’s feed rather than classic search results.
This confirms that discovery is moving beyond query-based search into feed-based, personalized content distribution.
Key Finding #4: Evergreen Content Is Declining
While breaking news is thriving, evergreen content is moving in the opposite direction, steadily losing visibility and clicks.
This suggests that AI Overviews and generative search experiences are directly competing with informational, long-tail content, reducing the need for users to click through to articles.
Why Breaking News Is Being “Protected”
The study suggests two main reasons why breaking news is less affected by AI: its real-time nature makes it difficult for AI systems to summarize quickly and reliably, and Google continues to prioritize it in formats like Top Stories, which still encourage clicks.
A Shift From Search to Discovery
One of the clearest conclusions from the study is that traffic is shifting from query-based search to discovery-based systems like Google Discover.
For the first time, Discover is no longer a secondary channel — it’s becoming a core traffic driver.
What This Means for SEO and Content Strategy
The implications are significant:
1. Discover Optimization Is Now Critical
Publishers need to treat Discover as its own channel with unique signals and strategies.
2. Content Mix Matters More Than Ever
With evergreen content losing ground and timely content gaining, the balance between the two becomes a strategic decision.
3. Zero-Click Behavior Will Likely Increase
As AI Overviews expand, more informational queries may be answered directly in search.
4. Diversification Is No Longer Optional
Relying solely on traditional search traffic is becoming increasingly risky.
The Bigger Picture
This study reinforces a broader shift in the search ecosystem, away from query-driven clicks and toward AI answers and personalized feeds.
The takeaway is clear:
visibility is no longer just about ranking — it’s about where and how content is surfaced across Google’s ecosystem.
In an AI-driven environment, timeliness, relevance, and distribution channel may matter just as much as the content itself.
Taken together, this week’s updates point to a search landscape that’s becoming more fragmented, more personalized, and more complex to navigate. From evolving crawl systems to AI-driven interfaces and shifting traffic patterns, the rules of visibility are changing. The challenge now isn’t just ranking—it’s understanding how your content fits into an ecosystem where discovery happens across search, feeds, and AI-generated experiences.