S3 request rate limits


Rich Jones

Dec 16, 2019, 11:16:08 AM12/16/19
to Alluxio Users

We are using Alluxio 1.8.2 with S3 as the UFS and alluxio.user.file.readtype.default=NO_CACHE.

We are hitting S3 request rate limits while reading a large amount of data from a bucket. We might request a limit increase from AWS or look at repartitioning the data, but is there anything in the Alluxio config that can throttle request rates to the UFS?


Thanks,

Rich

Bin Feng

Dec 16, 2019, 4:56:04 PM12/16/19
to Alluxio Users
Hi Rich,

There is no dedicated rate limiter for S3. However, you might be able to achieve throttling by limiting the number of concurrent requests. Try tuning alluxio.underfs.s3.threads.max and see if it helps. Lowering alluxio.master.rpc.executor.max.pool.size, alluxio.worker.network.block.reader.threads.max, or alluxio.worker.network.block.writer.threads.max, which limit the processing of Alluxio RPC requests, might also help.
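As a rough sketch, these properties would go in conf/alluxio-site.properties on the relevant nodes (the S3 thread pool on nodes that talk to the UFS, the RPC pools on master/workers). The values below are illustrative starting points, not recommendations, and changing them requires restarting the affected Alluxio processes:

```
# Cap concurrent S3 client threads; fewer threads means fewer
# simultaneous requests against the bucket (illustrative value)
alluxio.underfs.s3.threads.max=20

# Optionally also cap Alluxio RPC processing concurrency,
# which indirectly limits how much UFS load can be generated
alluxio.master.rpc.executor.max.pool.size=256
alluxio.worker.network.block.reader.threads.max=512
alluxio.worker.network.block.writer.threads.max=512
```

Note that these pools serve all traffic, not just S3-bound requests, so lowering them trades off overall Alluxio throughput, not only UFS request rate.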