
"something is happening... GSC has been extra slow at loading data since Thursday last, really long 'prompts' are appearing in the data, and impressions are up/positions down again (for me at least)."
"Remember the num=100 situation where Google removed support for the parameter based on AI scrapers and bots scraping the Google SERPs? When they did that, average position surged heavily and impressions dropped (since those scrapers were being excluded). Well, @AnalyticsEdge pinged me about a recent change for some sites. And I'm seeing it too for *some* sites (not all). Right around 12/3 or 12/4, some sites are seeing a big drop in average position while impressions skyrocket again. That leads me to believe some scrapers are now getting through. Check your stats. You might be one of them. Now lets see if Google fights back as part of the cat and mouse game..."
Google removed support for the num=100 parameter to block AI scrapers and bots from scraping search results. Because a scraper requesting 100 results per page registers impressions for results ranked 11–100, excluding that traffic caused impressions to fall and average position to improve sharply. Around 12/3–12/4, some sites began showing large impression increases and sharply worsening average position, consistent with scraper traffic getting through again. The effect is inconsistent across sites, and some GSC accounts remain unaffected. Search Console loading delays and unusually long query strings in the data have been reported alongside the metric shifts. The pattern suggests an ongoing cat-and-mouse contest between scraper workarounds and Google's countermeasures.
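To make the mechanism concrete, here is a minimal sketch with hypothetical numbers (not real GSC data): impressions logged at ranks 11–100 by a num=100 scraper both inflate the impression count and drag the average position metric deeper, so blocking that traffic reverses both moves at once.

```python
# Minimal sketch (hypothetical numbers, not real GSC data) of why scraper
# traffic moves both metrics at once. GSC's average position is the mean
# rank across all counted impressions.

def summarize(impressions):
    """Return (impression count, average position) for a list of ranks."""
    return len(impressions), sum(impressions) / len(impressions)

user_impressions = [3] * 50                  # 50 real impressions at rank 3
scraper_impressions = list(range(11, 101))   # one scraper pass over ranks 11-100

count_all, avg_all = summarize(user_impressions + scraper_impressions)
count_clean, avg_clean = summarize(user_impressions)

print(f"scrapers counted: {count_all} impressions, avg position {avg_all:.1f}")
print(f"scrapers blocked: {count_clean} impressions, avg position {avg_clean:.1f}")
# scrapers counted: 140 impressions, avg position 36.8
# scrapers blocked: 50 impressions, avg position 3.0
```

The same arithmetic runs in both directions: scrapers returning makes impressions skyrocket while the average position metric degrades, which is the pattern described above.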
Read at Search Engine Roundtable