SEOIntel Weekly News Round-up (Last Week of August 2025)

Marie Aquino
August 29, 2025

This week’s SEO news spans everything from fresh algorithm updates to AI’s ongoing role in shaping discovery. Google has rolled out its August 2025 Spam Update, putting low-quality and manipulative tactics back under the microscope. At the same time, new research shows that while AI tool adoption is growing, traditional search remains the dominant way people find information.

We also learned that ChatGPT leans on Google Search data via SerpApi, underscoring Google’s continuing centrality even in an AI-driven world. Add in Google’s latest tweaks to AI Mode designed to encourage more clicks, new guidance for JavaScript-based paywalls, and a recently resolved crawl-rate bug in Search Console, and it’s clear there’s a lot to stay on top of.

Google’s August 2025 Spam Update: What You Need to Know

On August 26, 2025, Google launched its first spam update of the year—a global rollout targeting content that violates its spam policies. The update is expected to unfold over several weeks and affects all languages and regions.

Context: The Spam Update Timeline

This marks Google’s first spam-related algorithm change since December 2024, which rolled out from December 19 to 26 and caused more volatility than June 2024’s update. Here’s a snapshot of recent spam update rollouts:

  • December 2024: ~7-day rollout with notable ranking shifts
  • June 2024: Typical duration of one week
  • March 2024: ~15-day rollout
  • October 2023: ~15-day rollout
  • Earlier spam updates included the link-specific December 2022 release and others going back years. These updates reflect Google’s evolving effort to clamp down on manipulative tactics.

What This Means for Your Site

This is what you need to know:

  • If your site engages in spammy practices—like thin or duplicated content, manipulative redirects, or cloaking—you may see ranking declines.
  • Google advises website owners to review its spam policies and bring their sites into compliance; it can take months for automated systems to recognize improvements and restore rankings.
  • Unlike link-specific updates, this August release targets a broad set of spam policy violations, though it does not specifically target link spam or site reputation abuse.

Historical Impact: What Has Happened in The Past?

Spam updates often spike volatility for affected sites. For instance:

  • The December 2024 update affected some sites that had surged in the previous core update only to fall again.
  • The link spam update of December 2022 neutralized manipulative links, leading to long-term ranking drops for many sites.

What SEO Professionals Should Do This Week

  1. Audit your content for spam patterns—thin pages, PAA stuffing, doorway pages, or automation—then clean them up.
  2. Revisit your site’s compliance with Google’s spam policy guidelines, and patch anything that looks manipulative or low-value.
  3. Monitor Search Console closely over the next few weeks, watching impressions, clicks, and ranking movements.
  4. Hold off on large-scale changes unless your diagnostics indicate a direct spam violation. Premature changes can delay recovery.
  5. Focus on quality—create helpful, original, expert content with good structure and usability. This is the most durable defense against spam updates.
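For step 1 of the checklist, a content audit often starts with finding near-duplicate pages. As one illustrative approach (not a Google-endorsed method), you can compare pages by word-shingle overlap; the shingle size and similarity threshold below are arbitrary example choices:

```python
# Hypothetical audit helper: flag page pairs whose body text is suspiciously
# similar, using Jaccard similarity over k-word shingles.

def shingles(text: str, k: int = 3) -> set:
    """Return the set of k-word shingles of a lowercased text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str, k: int = 3) -> float:
    """Jaccard similarity (0.0 to 1.0) between the shingle sets of two texts."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

def flag_near_duplicates(pages: dict, threshold: float = 0.8) -> list:
    """Return URL pairs whose body text similarity meets the threshold."""
    urls = sorted(pages)
    return [
        (u, v)
        for i, u in enumerate(urls)
        for v in urls[i + 1:]
        if jaccard(pages[u], pages[v]) >= threshold
    ]
```

Pairs this flags are candidates for consolidation, canonicalization, or rewriting; the threshold should be tuned against pages you have manually reviewed.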

Summary: Why It Matters

This ongoing spam update signals a renewed emphasis on spam prevention in Google Search. Even if your site appears untouched, it’s a timely prompt to double down on best practices and audit existing content. The best insurance: affirm your site’s quality and guard against bad content before it costs you visibility.


AI Tools Adoption Grows, But Search Still Dominates

SparkToro’s latest study from Rand Fishkin (using Datos’ multi-million-device clickstream panel) reports that over 20% of Americans are now “heavy” AI users—i.e., they use tools like ChatGPT, Claude, Copilot, Gemini, Perplexity, or Deepseek 10+ times per month. At the same time, traditional search remains almost universal: ~95% of Americans use search engines monthly and ~86% are “heavy” search users, underscoring that search isn’t being displaced even as AI tools rise.

Adoption… and a slowdown

Heavy AI usage has grown from ~3% in early 2023 to ~21% by June 2025, but the growth rate is decelerating; the combined AI tools group hasn’t posted a 1.1×+ growth month since September 2024. SparkToro’s read: we may be approaching a plateau unless AI providers can broaden appeal beyond early adopters (especially knowledge workers).

Search still dominates (and even edges up)

Search isn’t just steady—it shows signals of slight growth in heavy usage (e.g., heavy Google users rising from 84% to 87% between 2023 and 2025) and mostly stable overall visits per searcher (aside from a summer dip). SparkToro also points to outside data suggesting ChatGPT adopters often increase their Google searches, implying AI is acting as a complement, not a replacement.

Methodology notes (read before you generalize)

The study aggregates two baskets: traditional search (Google, Bing, Yahoo, DuckDuckGo) vs AI tools (ChatGPT, Claude, Copilot, Gemini, Perplexity, Deepseek). It’s US-only, desktop-centric (mobile apps excluded), and covers Jan 2023–mid-2025—which is crucial context when applying these trends to your audience or channel mix.

What this means for SEO

  • AI isn’t “killing” search. SparkToro’s numbers support a coexistence model: AI tools are rising fast, but search remains the default habit for nearly everyone. Treat AI as adjacent demand—not a zero-sum rival.
  • Zero-click pressure is real, distribution shifts are ongoing. Even with stable search usage, more answers on the SERP (and in AI layers) can compress clicks to publishers. Visibility now includes being cited and surfaced inside AI experiences, not just ranked.
  • Audience nuance matters. SparkToro explicitly recommends analyzing your own audience’s AI adoption—what’s true in aggregate may differ by vertical (finance, developer tools, education, etc.).

Action checklist for marketers & SEOs

  1. Design for extraction and citation. Put concise, verifiable answers near the top; use clear H2/H3s, FAQs, tables, and steps to be quotable in AI and helpful in search.
  2. Publish “click-worthy” assets. Create things summaries can’t replace—interactive tools, proprietary data, benchmarks, calculators, original reporting, and opinionated POVs.
  3. Segment by AI-sensitive queries. Flag topics likely to trigger AI layers; track impressions, CTR shifts, and assisted conversions to judge real business impact.
  4. Build owned demand. Convert AI/search exposure into email, community, and app subscribers so you’re less dependent on any single referrer.
  5. Tune for vertical surfaces. If your industry shows up in Google’s vertical experiences (Finance, Flights, Shopping), ensure structured data and task-oriented UX that win inclusion and link-outs.
  6. Market to AI users (not just through AI). For the 20% heavy users, provide promptable summaries, downloadable templates, and datasets they can feed into their tools—then invite them to your deeper asset.
  7. Mind the limits of the data. Desktop-only, US-only trends are directionally useful; validate against your own logs, Search Console, and analytics before making big budget shifts.
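Step 3 of the checklist (tracking CTR shifts on AI-sensitive queries) can be sketched as a simple before/after comparison over exported performance data. The row shape below mimics a Search Console query export; the field names and the 2-point drop threshold are assumptions for illustration:

```python
# Illustrative sketch: flag queries whose click-through rate fell by at
# least `min_drop` percentage points between two periods.

def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate, guarding against zero impressions."""
    return clicks / impressions if impressions else 0.0

def ctr_shift(before: dict, after: dict) -> float:
    """CTR change (as a fraction) for one query between two periods."""
    return ctr(after["clicks"], after["impressions"]) - ctr(before["clicks"], before["impressions"])

def flag_ctr_drops(queries: dict, min_drop: float = 0.02) -> list:
    """Return queries whose CTR fell by at least `min_drop` (2 points by default)."""
    return sorted(
        q for q, periods in queries.items()
        if ctr_shift(periods["before"], periods["after"]) <= -min_drop
    )
```

Running this only over queries you have tagged as likely to trigger AI answers isolates the AI effect from ordinary seasonal movement.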

Bottom line

SparkToro (Rand Fishkin) finds AI tool use is widespread and growing, but slowing, while traditional search remains the near-universal behavior in the US. The practical play is not to abandon SEO—or to chase AI at the expense of search—but to optimize for both realities: be the source AI wants to cite and the site humans want to visit.


ChatGPT Relies on Google Search (via SerpApi)

Recent reports confirm that ChatGPT is using Google Search results scraped through SerpApi, an independent web-scraping service, to provide real-time answers to fast-moving questions like sports scores, financial updates, and news. This method supplements ChatGPT’s internal index and partnerships (like Bing and publishers), especially for dynamic content where freshness is crucial.

Why It Matters

Despite positioning itself as a rival to Google Search, ChatGPT still needs Google’s unrivaled index to answer time-sensitive queries. Notably, it relies not just on Google’s crawling infrastructure but on Google’s constantly refreshed search data.

Ethical and Strategic Fallout

This setup raises several concerns:

  • Ethical/Legal: Using Google’s data through scraping rather than authorized access is contentious.
  • Competitive Irony: ChatGPT leverages the infrastructure of the very company it seeks to compete with.
  • Operational Risk: If Google blocks SerpApi or clamps down on scraping, ChatGPT’s real-time capabilities could suffer.

What This Means for SEO

Here’s how content creators and SEOs can turn this insight into advantage:

  1. Keep content visible in SERPs. Since Google’s index influences even AI tools like ChatGPT, strong rankings still matter.
  2. Think about AI citation, not just ranking. Clear summaries and structured content (FAQs, step lists, schema) make your content more likely to be quoted or referenced.
  3. Track cross-platform visibility. If your content appears in ChatGPT responses, even without a strong SERP rank, that’s a meaningful authority signal. Watch for AI citations accordingly.
  4. Monitor your content’s reach via scraping tools. Understand how your pages display when fetched by services like SerpApi—this can inform presentation and metadata strategies.
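For step 4, SerpApi-style services return Google results as JSON containing an "organic_results" list. A minimal sketch of checking where your domain appears in such a payload might look like the following; the sample shape follows SerpApi’s documented response format, but verify it against the live API before relying on it:

```python
# Hedged sketch: report (position, url) pairs in a SerpApi-shaped response
# that belong to a given domain (including its subdomains).
from urllib.parse import urlparse

def positions_for_domain(serp_json: dict, domain: str) -> list:
    """Return (position, url) pairs in organic results matching `domain`."""
    hits = []
    for result in serp_json.get("organic_results", []):
        host = urlparse(result.get("link", "")).netloc
        if host == domain or host.endswith("." + domain):
            hits.append((result.get("position"), result["link"]))
    return hits
```

Run periodically against your priority queries, this gives a record of how your pages surface to any downstream consumer of scraped SERP data, AI tools included.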

Final Thought

The revelation that ChatGPT fuels its answers with Google Search data—via SerpApi—doesn’t diminish the importance of traditional SEO. On the contrary, it reinforces it. Until AI can be truly independent of Google’s infrastructure, ranking there—and formatting content for AI visibility—remains a cornerstone of discovery in the AI era.


Google Tweaks AI Mode to Encourage Clicks

Google is experimenting with updates to AI Mode aimed at nudging users toward clicking through to sites more often. VP of Product Robby Stein shared on LinkedIn that Google is “experimenting with how and where to show links in ways that are most helpful to users and sites,” with new features rolling out in the coming weeks.

What’s Changing

  • Embedded link carousels now appear in AI Mode responses on desktop, and will be coming to mobile soon.
  • Smarter inline links are being tested—these are links embedded directly within AI-generated text, appearing precisely when users are most likely to want more detail.
  • The Web Guide experiment is expanding from the “Web” tab to the main “All” tab for users in Search Labs, surfacing organized AI-curated link clusters for complex queries.

Why It Matters

Since the introduction of AI Overviews and AI Mode, publishers—especially in informational and news formats—have seen click-through rates drop on summarized content. By making links more visible and context-rich, Google is addressing this head-on, aiming to balance AI summarization with site discovery.

How SEOs Should Adapt

  1. Optimize for recognition: Offer clear, well-structured content (e.g., concise intros, heading clarity, and FAQs) to improve AI citation likelihood.
  2. Strengthen link context: Craft compelling anchor text within your content to improve the appeal of inline links when they appear.
  3. Prioritize depth and uniqueness: Focus on content that AI can’t fully answer without a click—like tools, unique data, and expert insights.
  4. Monitor AI-driven clicks: Track visibility and click behavior on queries known to trigger AI Mode responses.
  5. Be ready to surface: Content included in carousels or prioritized in Web Guide can capture early attention—structure accordingly for quick visibility.

Final Thought

Google’s link-centric AI Mode changes are subtle but impactful: visibility now depends not just on ranking, but on citation, clarity, and context. For SEOs and content teams, this is a reminder that success lies in being not just listed—but highlighted and clicked.


Google Updates SEO Guidance for JavaScript-Based Paywalls

Google has updated its “Fix Search-related JavaScript problems” guide to include explicit instructions about JavaScript-based paywalls. The new recommendation cautions against implementations that deliver the full content in the server response only to hide it with JavaScript. Instead, Google says paywall content should be served only after confirming a user’s subscription status.

Why This Matters

This change highlights a core SEO concern: poorly implemented JavaScript paywalls can confuse Google’s crawler, making it difficult to determine which content is paywalled and which isn’t—leading to indexing issues or potential penalties.

Why the Old Approach Fails

Many JavaScript paywalls inject the full article in the page source and then hide it via scripts. While this works for users post-login, it creates ambiguity for Googlebot, interfering with correct indexing and rendering—possibly triggering cloaking flags.

Best Practice: Get the Timing Right

  • Serve paywalled content only after authentication—the content must not be present in the initial HTML or accessible via JS before login.
  • Use structured data: Implement JSON-LD (e.g., using isAccessibleForFree: false and hasPart pointing to the blocked element) to help Google understand what part of the content is behind the paywall and avoid misinterpretation.
  • Test thoroughly: Use Google’s URL Inspection and Rich Results Test tools to ensure search bots see the correct version of your page.
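The structured data Google documents for paywalled content can be generated as in this small sketch: isAccessibleForFree is set to false, and hasPart points at the gated section via a CSS class selector. The ".paywall" class name and the NewsArticle type are example choices; adapt both to your page:

```python
import json

# Sketch of paywall structured data: mark the article as not free and
# identify the gated section by CSS selector so crawlers aren't misled.

def paywall_jsonld(headline: str, selector: str = ".paywall") -> str:
    """Return JSON-LD marking the section at `selector` as paywalled."""
    markup = {
        "@context": "https://schema.org",
        "@type": "NewsArticle",
        "headline": headline,
        "isAccessibleForFree": False,
        "hasPart": {
            "@type": "WebPageElement",
            "isAccessibleForFree": False,
            "cssSelector": selector,
        },
    }
    return json.dumps(markup, indent=2)
```

The emitted block goes in a script tag of type application/ld+json; validate the result with the Rich Results Test before shipping.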

How SEOs & Publishers Should Act

  1. Audit your paywall implementation—ensure your server doesn’t deliver paywalled content in the initial page load.
  2. Add structured data with accurate cssSelector markers to identify paywalled sections.
  3. Validate in Search Console tools—confirm that Google indexes the accessible content properly and understands what’s restricted.
  4. Ensure crawlability and usability—don’t rely on cloaking or hidden content that misleads search engines.
  5. Educate teams—developers, editors, and technical SEO teams should align on implementation to protect both user experience and search visibility.

Summary

Google’s refined guidance on JavaScript paywalls removes ambiguity about acceptable practices. Paywalled content must be withheld from the crawler’s view until access is granted, and site owners should use structured data to indicate restricted content. This update underscores a broader principle: transparency and technical compliance remain critical to long-term SEO success—even behind the paywall.


Google Resolves Crawl-Rate Bug That Affected Sites in Search Console

Between August 8 and August 28, 2025, many webmasters noticed a sharp decline in crawl activity reported in Google Search Console. Crawl rates dropped from thousands of requests per day to near-zero levels for some sites. The issue appeared across popular hosting platforms including Vercel, WP Engine, and Fastly—suggesting a systemic crawl disruption.

Google’s Response

On August 28, Google’s Search Advocate John Mueller confirmed it was a bug on Google’s end—not caused by site configurations. Crawling has since begun returning to normal levels, and the crawl stats will catch up automatically in the coming days.

Who Saw the Impact—and What Was the Effect

While many sites experienced alarming drops in crawl stats, most did not see any downturn in rankings or traffic. This was likely because Google maintains cached versions of pages and prioritizes indexing quality over immediate crawl volume. The data suggests visibility wasn’t significantly impacted, but smaller or time-sensitive updates may take longer to be reflected.

History and SEO Context

Similar crawl reporting glitches have impacted SEOs in the past—such as Discover performance data inconsistencies. These incidents underscore the importance of not reacting impulsively to reporting anomalies. Maintaining content quality and structural hygiene continues to be essential, regardless of tool hiccups.

What SEOs Should Do Now

  • Avoid reactive changes. A sudden crawl dip may reflect a reporting issue—not a site crash.
  • Use multiple data points. Review indexing, traffic, and log data alongside Console crawl stats.
  • Monitor ongoing recovery. Crawling should normalize within days; track if delays persist.
  • Build resilience. Create crawl-efficient site structures so minor bugs don’t disrupt indexing.
  • Stay informed. Check Google’s status updates or official channels before launching major fixes.
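The “use multiple data points” advice above can include your own server logs as a crawl-stats cross-check independent of Search Console. A rough sketch, assuming common-log-style access lines (adjust the pattern to your server, and note that production checks should also verify the client IP, since the user agent is easily spoofed):

```python
import re
from collections import Counter

# Illustrative log cross-check: count Googlebot requests per day straight
# from access-log lines, independent of Search Console reporting.

LOG_DATE = re.compile(r'\[(\d{2}/\w{3}/\d{4})')

def googlebot_hits_per_day(log_lines) -> Counter:
    """Return a Counter mapping 'DD/Mon/YYYY' to Googlebot request counts."""
    hits = Counter()
    for line in log_lines:
        if "Googlebot" in line:
            m = LOG_DATE.search(line)
            if m:
                hits[m.group(1)] += 1
    return hits
```

If your logs showed normal Googlebot activity while Search Console reported a collapse, that pointed to a reporting bug, exactly the scenario of this incident.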

Final Thought

The August crawl blip highlights a new SEO reality: tool reliability isn’t guaranteed. The best defense is a site that’s crawl-efficient, resilient, and well-structured. That way, a temporary data hiccup doesn’t become a long-term ranking issue.


From algorithmic crackdowns to subtle product shifts, this week reinforces the fast-moving reality of SEO in 2025. AI may be changing how people interact with information, but search remains strong—and Google continues refining both its rules and its tools. For marketers and site owners, the lesson is the same: stay adaptable, focus on quality and clarity, and always be ready to pivot when the next wave of change arrives.