Hi Edward,
I will mostly defer to actual system administrators - there are some really smart folks in this forum, and hopefully some of them will chime in.
My first suggestion would be to have Library Host review the webserver logs and use Nginx rules to outright block the IP ranges that keep sending repeated requests and are consistently proving to be bad actors. Of the prior forum threads you found in your original message, one included the command our system admin uses to search the Nginx access logs, and the other, from Jim Adamson, showed a possible configuration block you can use to block the offenders you find. Also, one user mentioned three specific bots they blocked that immediately brought the traffic down, here.
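I don't have the exact command from that thread handy, but as a rough, generic example (assuming the stock Debian/Ubuntu log location and the default "combined" log format), something like this will tally the user agents hitting the site most often:

```
# Count the most frequent user agents in the Nginx access log
# (adjust the path and field number to your own log format).
awk -F'"' '{print $6}' /var/log/nginx/access.log | sort | uniq -c | sort -rn | head -20
```

And for the blocking itself, a minimal sketch of the usual pattern - a map of bad user agents plus a deny for a misbehaving range - might look like the following. The bot names, server name, and IP range below are only placeholders, not the specific ones from the linked threads:

```
# In the http {} context (nginx.conf or a conf.d include):
map $http_user_agent $blocked_agent {
    default        0;
    ~*examplebot   1;   # placeholder names only
    ~*scrapycrawl  1;
}

# In the AtoM server {} block:
server {
    listen 80;
    server_name archives.example.org;   # placeholder

    # Refuse a whole range that keeps hammering the site
    # (203.0.113.0/24 is a documentation range; substitute the real offenders).
    deny 203.0.113.0/24;

    if ($blocked_agent) {
        return 403;
    }

    # ... rest of the existing AtoM configuration ...
}
```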
Alternatively, a quick search for "block bots with Nginx" turns up dozens of results. Perhaps discuss some of these options with your hosting provider?
You can also try adding a robots.txt file to your site (also noted in one of those linked threads), though of course the worst of the bot actors don't respect these...
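If it helps, a minimal robots.txt sketch might look like this - the bot names are placeholders, and as noted, only well-behaved crawlers will honour it:

```
# Served from the web root as /robots.txt; names below are examples only.
User-agent: ExampleBadBot
Disallow: /

# Ask everyone else to slow down (Crawl-delay is not honoured by all crawlers).
User-agent: *
Crawl-delay: 10
```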
Finally, I did see this other thread (here) that included some tips for getting a third-party tool, Anubis, set up with AtoM.
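For context, my understanding is that Anubis normally sits between Nginx and the application, so Nginx proxies requests to Anubis and Anubis forwards verified visitors on to AtoM. A very rough sketch of the Nginx side, with placeholder server name and port (defer to that thread and the Anubis documentation for the real values and for configuring the Anubis service itself):

```
server {
    listen 80;
    server_name archives.example.org;   # placeholder

    location / {
        # Placeholder port; point this at wherever Anubis is listening.
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```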
Hopefully others can share their experiences, tips, tricks, and workarounds!