Excessive traffic - mostly bots - exploring ways to rate limit requests


Agustina Martinez-Garcia

Jan 9, 2025, 4:16:34 AM
to DSpace Technical Support
Dear all,

We continue to experience huge performance issues caused by very high bot traffic and we are now exploring options for rate limiting.

We use Apache in front of our DSpace, and are looking at rate limiting options there. Does anybody use mod_evasive or similar? If so, what is the experience, would you be willing to share details on a configuration that works well for you?
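For anyone else evaluating the same approach, a minimal mod_evasive setup might look like the sketch below. These thresholds are common illustrative starting points, not values recommended anywhere in this thread; they should be tuned against your own access logs before enabling in production.

```apache
# Illustrative mod_evasive settings (on Debian/Ubuntu typically
# /etc/apache2/mods-available/evasive.conf, enabled via a2enmod evasive).
# All numeric values below are assumptions to adjust for your traffic profile.
<IfModule mod_evasive20.c>
    DOSHashTableSize    3097
    DOSPageCount        10      # max requests for the same URI per page interval
    DOSPageInterval     1       # page interval, in seconds
    DOSSiteCount        100     # max requests site-wide from one IP per site interval
    DOSSiteInterval     1       # site interval, in seconds
    DOSBlockingPeriod   60      # seconds an offending IP keeps receiving 403s
    DOSLogDir           "/var/log/mod_evasive"
    DOSWhitelist        127.0.0.1
</IfModule>
```

One caveat worth noting: mod_evasive counts requests per client IP, so aggressive bot fleets that rotate across many addresses may stay under these per-IP thresholds even when aggregate load is high.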

Thanks so much!
Agustina

DSpace Technical Support

Feb 10, 2025, 11:14:21 AM
to DSpace Technical Support
Hi Agustina,

I realize this is an older message now, but I wanted to highlight for you (and others experiencing high bot traffic) the new DSpace 8.1 and 7.6.3 releases.  Both of these releases include major improvements to how Server-Side Rendering (SSR) is processed in the User Interface.  We *believe* these SSR improvements should help with bot traffic, as the goal of these improvements is to *limit* which pages are accessible to bots.  Most bots cannot process JavaScript, which means they should not be able to access pages that do not undergo SSR.

More on 8.1 and 7.6.3 in their release notes:

Tim