Hi everyone,
We’re currently managing several AtoM websites, and recently we’ve been seeing a significant increase in traffic from AI bots — including GoogleBot, ChatBot, AmazonBot, AliCloud, Meta-Crawler, and others.
From our access logs, these bots are making frequent requests, often using older versions of Chrome as the user agent. This has led to high CPU usage and is starting to impact server performance.
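To give a sense of how we've been quantifying this — a minimal sketch (the bot pattern list is an assumption, and it assumes the common/combined log format where the user agent is the last double-quoted field):

```python
import re
from collections import Counter

# Substrings we treat as bot signatures (assumed list; extend for your logs).
BOT_PATTERNS = ["googlebot", "amazonbot", "alicloud", "meta", "gptbot", "bot", "crawler", "spider"]

def count_bot_hits(lines):
    """Tally requests per user agent for log lines matching any bot pattern."""
    hits = Counter()
    ua_re = re.compile(r'"([^"]*)"\s*$')  # last quoted field = user agent
    for line in lines:
        m = ua_re.search(line)
        if not m:
            continue
        ua = m.group(1)
        if any(p in ua.lower() for p in BOT_PATTERNS):
            hits[ua] += 1
    return hits

if __name__ == "__main__":
    with open("/var/log/nginx/access.log") as f:  # adjust path to your server
        for ua, n in count_bot_hits(f).most_common(20):
            print(n, ua)
```

Running something like this against a day of logs is what showed us how skewed the traffic is. Note it won't catch bots hiding behind a plain (old) Chrome user agent — those only show up when you cross-reference request rates per IP.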
This seems to be a growing and common issue. I’m considering blocking such traffic based on User-Agent patterns or known bot signatures (especially those using outdated browser versions commonly associated with scraping/AI indexing).
Has anyone else faced a similar challenge? If so, I’d greatly appreciate any advice or solutions you’ve implemented to detect, limit, or block excessive AI bot traffic without affecting legitimate users or SEO.
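For anyone comparing notes: a rough sketch of what User-Agent blocking plus rate limiting could look like at the web server level, assuming nginx sits in front of AtoM. The bot names, the Chrome version cutoff, and the rate values are placeholders, not recommendations — adjust to what your own logs show. (The map and limit_req_zone directives belong in the http context, e.g. a conf.d file.)

```nginx
# Flag known bot signatures and suspiciously old Chrome versions.
map $http_user_agent $is_unwanted_bot {
    default                                                       0;
    ~*(GPTBot|Amazonbot|Bytespider|ClaudeBot|meta-externalagent)  1;
    ~*Chrome/([1-6]?[0-9])\.                                      1;  # Chrome 0-69; adjust cutoff
}

# Shared rate-limit zone keyed by client IP.
limit_req_zone $binary_remote_addr zone=atom_rl:10m rate=2r/s;

server {
    listen 80;
    server_name example.org;   # placeholder

    location / {
        # Hard-block flagged user agents.
        if ($is_unwanted_bot) {
            return 403;
        }
        # Throttle bursts from everyone else.
        limit_req zone=atom_rl burst=10 nodelay;
        # ... proxy_pass / fastcgi_pass to AtoM as usual ...
    }
}
```

The rate limit is the part that helps against bots spoofing browser user agents, since it doesn't depend on the UA string at all.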
Have you tried adding a 'robots.txt' file to the root of your AtoM site? Search crawlers, at least those from reputable operators, usually respect the rules you define in that file.
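For example, something along these lines at the web root (the bot names are just illustrations — check each crawler's published user-agent token):

```
# Ask specific crawlers to stay away entirely
User-agent: GPTBot
User-agent: Amazonbot
Disallow: /

# Slow everyone else down (not all crawlers honour Crawl-delay)
User-agent: *
Crawl-delay: 10
```

Worth keeping in mind that robots.txt is purely advisory: bots that masquerade as an old Chrome browser won't identify themselves to these rules at all, so this mainly helps with the well-behaved crawlers.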
Roberto Greiner
--
You received this message because you are subscribed to the Google Groups "AtoM Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to ica-atom-user...@googlegroups.com.
To view this discussion visit https://groups.google.com/d/msgid/ica-atom-users/d56dee50-262d-4cd2-8206-e2faa2e6b81bn%40googlegroups.com.
--
-----------------------------------------------------
Marcos Roberto Greiner
Optimists think we live in the best of all possible worlds
Pessimists fear that this is true
                                  James Branch Cabell
-----------------------------------------------------