What I've noticed is that the amount of bot traffic hitting my sites has been steadily increasing. I've addressed it through a combination of robots.txt and firewall rules blocking certain IP address ranges (I block the ones that don't respect Crawl-delay or ignore robots.txt altogether).
It's a never-ending battle against these bots, which all seem to be building their own Google-scale web databases.
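For reference, the polite-request side of that looks something like the robots.txt below. Worth noting: Crawl-delay is a non-standard extension of the Robots Exclusion Protocol; Bing and Yandex honor it, but Googlebot ignores it, which is part of why the firewall ends up being the real backstop.

```
# robots.txt — ask well-behaved crawlers to slow down
User-agent: *
Crawl-delay: 10

# Known-abusive crawlers can be disallowed entirely
# ("BadBot" here is a placeholder user-agent, not a real crawler)
User-agent: BadBot
Disallow: /
```

Bots that keep hammering the site despite this are the ones whose address blocks go into the firewall deny list.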
If your costs are coming from frontend instances, the simple fix is to cap how many can run at once. The trade-off is that when you do get a legitimate traffic spike, users will suffer.
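Concretely, in an autoscaled setup that cap is usually a single field. A sketch, assuming a Kubernetes deployment named `frontend` (a hypothetical name; the same idea applies to any platform's autoscaler max-instances setting):

```yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: frontend            # hypothetical deployment name
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: frontend
  minReplicas: 2
  maxReplicas: 5            # hard cap: spikes beyond this degrade service, but costs stay bounded
  metrics:
  - type: Resource
    resource:
      name: cpu
      target:
        type: Utilization
        averageUtilization: 70
```

Picking `maxReplicas` is exactly the trade-off above: low enough to bound your bill, high enough that a real traffic spike doesn't turn into an outage.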
-Joshua