[ratelimit] running multiple ratelimit servers


marc.b...@gmail.com

May 12, 2020, 4:04:43 PM5/12/20
to envoy-users
This question is regarding: https://github.com/envoyproxy/ratelimit
We are currently running a single ratelimit server with many clients sending/receiving data over gRPC (about 30-40 clients to 1 server).
The issue we are seeing is that the server can't keep up with the request rate from the clients. gRPC requests end up getting queued on the server, with a backlog that can grow to 30 minutes behind, i.e. the client is sending request data from 30 minutes ago.
We're running the ratelimit server in k8s, so I'm wondering if I can run multiple of them at once to handle the load/request rate.

The set-up would be as follows:
(30-40) clients -> k8s load balancer -> (2-4) servers handling requests -> redis backend
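
For illustration, a minimal sketch of what that k8s layer could look like (names, image tag, replica count, and port are hypothetical, not taken from an actual manifest):

```
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ratelimit
spec:
  replicas: 3                 # scale the server horizontally
  selector:
    matchLabels:
      app: ratelimit
  template:
    metadata:
      labels:
        app: ratelimit
    spec:
      containers:
        - name: ratelimit
          image: envoyproxy/ratelimit:master   # hypothetical tag
          env:
            - name: LOCAL_CACHE_SIZE_IN_BYTES
              value: "0"                       # disable the local cache
---
apiVersion: v1
kind: Service
metadata:
  name: ratelimit
spec:
  selector:
    app: ratelimit
  ports:
    - port: 8081              # ratelimit's default gRPC port
```

All replicas would share the same Redis backend, so the Service simply spreads gRPC connections across them.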

Looking at the code, I don't think this should be an issue, specifically when disabling the local cache so that everything goes to Redis (the increment on the cache key in Redis is atomic).

Disabling the local cache via:
```
LOCAL_CACHE_SIZE_IN_BYTES="0"
```

I was wondering if anyone knows whether it is safe to do so, or if there might be concurrency issues.
I'd also be curious how folks usually run the server in large environments with a high request rate per second.

Cagri Ersen

Oct 27, 2020, 5:16:53 PM10/27/20
to envoy-users
Same here.
I'd like to run multiple rate limit services on k8s.
Any clues about scalability for a dense production environment?

Marc Bassil

Nov 10, 2020, 10:45:49 PM11/10/20
to envoy-users
Well, if you disable the local cache, you can scale the rate limit service and it should work properly. For a dense production environment we had to test things manually to be sure, but we're running in a k8s environment, which made scaling easy.