On Mon, Dec 10, 2012 at 8:12 AM, S Ahmed <sahme...@gmail.com> wrote:
> But say you are running a web application on jetty/tomcat, which is threaded
> in itself.
>
> Why would your redis client need to be threaded also, each web request is on
> its own thread anyhow.
Maybe threads are not what you think they are. A thread shares the
same memory space as all other threads in the same process, so threads
can share global variables for common resources (like a thread-safe
hash map used as a local cache), or keep thread-local resources that
are not shared.
Presumably Jedis connections are a shared global resource.
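To make the shared-vs-thread-local distinction concrete, here is a minimal
Python sketch (illustrative only; the names are made up, and a plain dict
stands in for a thread-safe map, which is fine here only because CPython
makes single-key assignment atomic):

```python
import threading

shared_cache = {}            # one dict, visible to every thread
local = threading.local()    # each thread sees its own attributes

def worker(n):
    shared_cache[n] = n * 2  # shared: every thread writes the same dict
    local.scratch = n        # thread-local: private to this thread

threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(shared_cache))          # -> 4 (all threads wrote the shared dict)
print(hasattr(local, 'scratch'))  # -> False (main thread never set it)
```

A pooled client connection is the first kind of resource: one shared pool,
borrowed briefly by whichever thread needs it.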
> One thing I can think of is, if you get 1000 requests per second, then you
> have 1000 connections into redis.
No. You only have as many connections as are necessary to sustain the
requests, based on the number of sequential commands, the latency of
those commands, and the number of serving threads.
If you only have 10 threads serving those 1k requests/second, you will
only need 10 connections, because each thread will (in the worst case)
need a connection. If you have 1k threads serving those 1k requests in
a second, then you *may* need 1k connections, but only if your typical
request requires 1 full second to complete (or if all 1k requests come
all at the same time, requiring every thread to have a connection
simultaneously). If your request (including all Redis commands) can be
served in 10 ms from Redis, you may only need 10-20 connections to
Redis.
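The arithmetic behind those estimates is just rate times latency
(Little's law: average connections in use = requests per second x time
each request spends holding a connection). A tiny sketch, with the
hypothetical numbers from above:

```python
# Little's law: average connections in use =
#   requests_per_sec * seconds each request holds a connection
def connections_needed(requests_per_sec, redis_ms_per_request):
    return requests_per_sec * redis_ms_per_request / 1000.0

print(connections_needed(1000, 10))    # 10 ms in Redis -> 10.0 connections
print(connections_needed(1000, 1000))  # a full second  -> 1000.0 connections
```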
> But if your redis client has a pool, it will actually limit the threads in
> this case to e.g. 20 threads. So it is actually a way to limit the # of
> threads.
No. Connections are released back into the pool when they are done being used.
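A minimal sketch of pool semantics (this is not Jedis or any real client;
strings stand in for sockets): the pool caps concurrent *connections*, but
each thread gives its connection back as soon as it is done, so the pool
does not cap the number of threads:

```python
import queue
import threading

class ConnectionPool:
    """Fixed-size pool: hands out connections, takes them back."""
    def __init__(self, size):
        self._q = queue.Queue()
        for i in range(size):
            self._q.put('conn-%d' % i)  # stand-ins for real sockets

    def acquire(self):
        return self._q.get()            # blocks only while all are in use

    def release(self, conn):
        self._q.put(conn)               # returned to the pool, not destroyed

pool = ConnectionPool(size=2)

def handle_request():
    conn = pool.acquire()
    try:
        pass  # ... run Redis commands on conn ...
    finally:
        pool.release(conn)              # the thread keeps running; conn is free

threads = [threading.Thread(target=handle_request) for _ in range(50)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# 50 request threads completed, yet only 2 connections ever existed.
```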
As a case in point, I've got a web server running 2 processes and 5
threads each (I'm in Python, so heavy threading isn't quite so good).
At times, it will serve 200+ requests/second (I've got a lot of
headroom on that machine), but the most connections I've seen to Redis
from those (10 total) web serving threads over long periods of time is
4 (2 connections for the 5 threads in each process). I'm not limiting the
number of connections, but the number of connections still hangs
around 4, because the actual number of concurrent requests to Redis is
about 4 over time. On the other hand, when I'm performing some
resource-intensive process (like a manual re-indexing/re-caching of a
bunch of data, pushing Redis to about 75% processor), latency per
command increases. Due to increased command latency, more requests
overlap, so more connections are necessary. Usually it gets to 6-7
connections, but I have seen it max out at 10 on occasion.
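Plugging rough numbers from that anecdote into the same rate-times-latency
arithmetic (these figures are my guesses, not measurements) shows why
connection counts rise when Redis slows down:

```python
# concurrency = requests/sec * seconds of Redis time per request
print(200 * 20 / 1000.0)  # ~20 ms of Redis work per request -> 4.0 connections
print(200 * 35 / 1000.0)  # latency up under heavy load      -> 7.0 connections
```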
Regards,
- Josiah