It was thus said that the Great 'Sainan' via lua-l once stated:
> That really was the most polite way for me to put it.
No it wasn't.
> I think in 2025, at least static data services really ought to be able to
> handle millions of requests in a given hour (worst case scenario when
> you're being hugged to death).
  That's roughly 300 requests per second (a million requests spread over
3,600 seconds).  Do you not run a web server?  It seems to me that you do
not.  I do.  And when you have bots that identify themselves as, and I am
not making this up:
Mozilla/5.0 (compatible; Thinkbot/0.5.8; +In_the_test_phase,_if_the_Thinkbot_brings_you_trouble,_please_block_its_IP_address._Thank_you.)
and it comes from 500,000 different IP addresses, then yes, I'm banning
it, regardless of whether I can handle 300 requests per second or
3,000,000 requests per second, even if my site serves nothing but static
files.
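  And banning by User-Agent is cheap.  A minimal sketch of what that can
look like, assuming an OpenResty front end (the server block and names
here are illustrative, not a description of what I actually run):

    # hypothetical OpenResty fragment; only for illustration
    location / {
        access_by_lua_block {
            -- refuse anything whose User-Agent admits to being Thinkbot
            local ua = ngx.var.http_user_agent or ""
            if ua:find("Thinkbot", 1, true) then   -- plain substring match
                return ngx.exit(ngx.HTTP_FORBIDDEN) -- 403, request ends here
            end
        }
    }

That catches the bots polite enough to keep their User-Agent; the swarms
that lie about who they are still have to be dealt with by IP.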
> For dynamic data services, I do understand that IP banning is reasonable,
> but trust me, the abusers of public services have A LOT of ranges, so this
> is just not an ideal approach.
  So what is an ideal approach, oh ye of web hosting?  robots.txt?
Cloudflare, which is fast becoming a single point of failure for the web?
Something else entirely?
-spc