I have a setup with two different worker types using two different queues, and the jobs being processed by one set of workers on one queue seem to be impacting the performance of the other set of workers on a different queue.
The first set of workers processes a relatively large number of scheduled tasks, each of which is a small amount of work. The load is sporadic, but we can get bursts of up to several hundred jobs within a few seconds, and each job runs for less than a second. We have three worker processes on this queue, launched with this command:
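Simplified a bit (environment flags trimmed), it's essentially:

```shell
bundle exec sidekiq -q invoke
```
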
The second set of workers handles tasks in response to users interacting with our UI, and is much less busy. We get maybe a dozen jobs a minute, and each job should take less than a second to run since it just makes several indexed database lookups. We have one worker process on this queue, which is launched with:
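Again simplified, it boils down to:

```shell
bundle exec sidekiq -q configuration -c 1
```
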
We usually run with `-c 5`, but to simplify things for troubleshooting I've set it to 1 for now.
Also note, each of these runs in its own Kubernetes pod with 2 GB of memory and 3 CPUs allocated to it.
I'm seeing jobs on the second worker complete very quickly (<100 ms) about half the time, but quite slowly (>2 s) the other half. Looking at the slow jobs, I noticed that database queries that usually take <1 ms are taking upwards of 400 ms.
Initially I suspected database and/or connection-pool contention, but after simplifying the configuration during troubleshooting, I noticed that these slow queries happen even when only one job is running at a time. Watching `top`, I also noticed that the bundle process sporadically uses 100% of a CPU even when there are no jobs to run in this queue, and the periods when bundle is at 100% CPU correlate with the jobs that have slow queries.
I also figured out that the sporadic slow-down goes away if I run the `-q configuration` workers against a separate Redis database from the one the `-q invoke` workers are using. But making that work for us would be problematic, since we need these jobs to be queued from the same source.
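For that experiment I just pointed the second worker at a different Redis database number in its server config, along these lines (the URL here is a placeholder for our actual Redis host):

```ruby
# Troubleshooting only: give the -q configuration workers their own
# Redis database (DB 1 instead of the default 0).
Sidekiq.configure_server do |config|
  config.redis = { url: "redis://redis:6379/1" }
end
```
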
It appears that the `-q configuration` worker threads are being consumed by I/O for incoming scheduled jobs, even though those jobs are bound for a different queue. When I run the workers with `config.logger.level = Logger::DEBUG`, the worker spits out many "DEBUG: enqueued schedule" messages for jobs in the other queue.
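My understanding (a simplified sketch of what Sidekiq 5's scheduler poller does, not its actual code) is that every worker process, regardless of its `-q` flags, scans the shared `schedule` sorted set and enqueues whatever is due, into whatever queue the job names:

```ruby
require "json"

# Simplified stand-in for the scheduler poller that runs in every Sidekiq
# process: the Redis "schedule" sorted set is modeled as a Hash of
# job-JSON => run-at score, and the queues as a Hash of name => Array.
def enqueue_due_jobs(schedule_set, queues, now)
  schedule_set.keys.each do |job|
    next unless schedule_set[job] <= now  # not due yet
    schedule_set.delete(job)              # ZREM: claim the job
    queue = JSON.parse(job)["queue"]
    (queues[queue] ||= []) << job         # LPUSH onto the job's *own* queue
  end
end

queues = {}
schedule = {
  %({"queue":"invoke","class":"ScheduledTask"}) => 100,
  %({"queue":"configuration","class":"UiTask"}) => 300,
}

# A worker bound only to -q configuration still does this scan-and-push,
# which I believe is the I/O showing up in the thread dump below.
enqueue_due_jobs(schedule, queues, 200)
queues["invoke"].length # => 1; the due job went to the invoke queue
```
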
Is there a way to prevent workers on one queue from getting bogged down by job metadata in another queue?
Thanks in advance for any help or pointers.

For reference, here is a `TTIN` thread dump from the second worker, taken while this was happening:
2021-09-28T21:05:28.083Z 1 TID-gpbvq809p DEBUG: Got TTIN signal
2021-09-28T21:05:28.150Z 1 TID-gpbvq809p WARN: Thread TID-gpbvq809p
2021-09-28T21:05:28.150Z 1 TID-gpbvq809p WARN: /api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/cli.rb:164:in `backtrace'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/cli.rb:164:in `block (2 levels) in <class:CLI>'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/cli.rb:161:in `each'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/cli.rb:161:in `block in <class:CLI>'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/cli.rb:176:in `handle_signal'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/cli.rb:111:in `launch'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/cli.rb:96:in `run'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/bin/sidekiq:12:in `<top (required)>'
/api/vendor/bundle/ruby/2.6.0/bin/sidekiq:23:in `load'
/api/vendor/bundle/ruby/2.6.0/bin/sidekiq:23:in `<top (required)>'
/usr/local/lib/ruby/site_ruby/2.6.0/bundler/cli/exec.rb:74:in `load'
/usr/local/lib/ruby/site_ruby/2.6.0/bundler/cli/exec.rb:74:in `kernel_load'
/usr/local/lib/ruby/site_ruby/2.6.0/bundler/cli/exec.rb:28:in `run'
/usr/local/lib/ruby/site_ruby/2.6.0/bundler/cli.rb:463:in `exec'
/usr/local/lib/ruby/site_ruby/2.6.0/bundler/vendor/thor/lib/thor/command.rb:27:in `run'
/usr/local/lib/ruby/site_ruby/2.6.0/bundler/vendor/thor/lib/thor/invocation.rb:126:in `invoke_command'
/usr/local/lib/ruby/site_ruby/2.6.0/bundler/vendor/thor/lib/thor.rb:387:in `dispatch'
/usr/local/lib/ruby/site_ruby/2.6.0/bundler/cli.rb:27:in `dispatch'
/usr/local/lib/ruby/site_ruby/2.6.0/bundler/vendor/thor/lib/thor/base.rb:466:in `start'
/usr/local/lib/ruby/site_ruby/2.6.0/bundler/cli.rb:18:in `start'
/usr/local/bin/bundle:30:in `block in <main>'
/usr/local/lib/ruby/site_ruby/2.6.0/bundler/friendly_errors.rb:124:in `with_friendly_errors'
/usr/local/bin/bundle:22:in `<main>'
2021-09-28T21:05:28.250Z 1 TID-gpbvq809p WARN: Thread TID-gpbwgdl9d
2021-09-28T21:05:28.316Z 1 TID-gpbvq809p WARN: /api/vendor/bundle/ruby/2.6.0/gems/activerecord-5.2.2/lib/active_record/connection_adapters/abstract/connection_pool.rb:301:in `sleep'
/api/vendor/bundle/ruby/2.6.0/gems/activerecord-5.2.2/lib/active_record/connection_adapters/abstract/connection_pool.rb:301:in `block (2 levels) in run'
/api/vendor/bundle/ruby/2.6.0/gems/activerecord-5.2.2/lib/active_record/connection_adapters/abstract/connection_pool.rb:300:in `loop'
/api/vendor/bundle/ruby/2.6.0/gems/activerecord-5.2.2/lib/active_record/connection_adapters/abstract/connection_pool.rb:300:in `block in run'
2021-09-28T21:05:28.316Z 1 TID-gpbvq809p WARN: Thread TID-gpbx4hkp5 heartbeat
2021-09-28T21:05:28.322Z 1 TID-gpbvq809p WARN: /api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/launcher.rb:130:in `sleep'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/launcher.rb:130:in `start_heartbeat'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/util.rb:16:in `watchdog'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/util.rb:25:in `block in safe_thread'
2021-09-28T21:05:28.322Z 1 TID-gpbvq809p WARN: Thread TID-gpbx4hkkp scheduler
2021-09-28T21:05:28.422Z 1 TID-gpbvq809p WARN: /api/vendor/bundle/ruby/2.6.0/gems/hiredis-0.6.3/lib/hiredis/ext/connection.rb:19:in `read'
/api/vendor/bundle/ruby/2.6.0/gems/hiredis-0.6.3/lib/hiredis/ext/connection.rb:19:in `read'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/connection/hiredis.rb:54:in `read'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:265:in `block in read'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:253:in `io'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:264:in `read'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:123:in `block in call'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:234:in `block (2 levels) in process'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:375:in `ensure_connected'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:224:in `block in process'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:312:in `logging'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:223:in `process'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:123:in `call'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis.rb:1618:in `block in zrem'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis.rb:50:in `block in synchronize'
/usr/local/lib/ruby/2.6.0/monitor.rb:230:in `mon_synchronize'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis.rb:50:in `synchronize'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis.rb:1617:in `zrem'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/scheduled.rb:25:in `block (2 levels) in enqueue_jobs'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/scheduled.rb:15:in `each'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/scheduled.rb:15:in `block in enqueue_jobs'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq.rb:97:in `block in redis'
/api/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:65:in `block (2 levels) in with'
/api/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:64:in `handle_interrupt'
/api/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:64:in `block in with'
/api/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:61:in `handle_interrupt'
/api/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:61:in `with'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq.rb:94:in `redis'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/scheduled.rb:14:in `enqueue_jobs'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/scheduled.rb:77:in `enqueue'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/scheduled.rb:68:in `block in start'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/util.rb:16:in `watchdog'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/util.rb:25:in `block in safe_thread'
2021-09-28T21:05:28.522Z 1 TID-gpbvq809p WARN: Thread TID-gpbx4hkbt processor
2021-09-28T21:05:28.623Z 1 TID-gpbvq809p WARN: /api/vendor/bundle/ruby/2.6.0/gems/hiredis-0.6.3/lib/hiredis/ext/connection.rb:19:in `read'
/api/vendor/bundle/ruby/2.6.0/gems/hiredis-0.6.3/lib/hiredis/ext/connection.rb:19:in `read'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/connection/hiredis.rb:54:in `read'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:265:in `block in read'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:253:in `io'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:264:in `read'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:123:in `block in call'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:234:in `block (2 levels) in process'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:375:in `ensure_connected'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:224:in `block in process'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:312:in `logging'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:223:in `process'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:123:in `call'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:212:in `block in call_with_timeout'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:285:in `with_socket_timeout'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:211:in `call_with_timeout'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis.rb:1179:in `block in _bpop'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis.rb:50:in `block in synchronize'
/usr/local/lib/ruby/2.6.0/monitor.rb:230:in `mon_synchronize'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis.rb:50:in `synchronize'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis.rb:1176:in `_bpop'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis.rb:1221:in `brpop'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/fetch.rb:36:in `block in retrieve_work'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq.rb:97:in `block in redis'
/api/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:65:in `block (2 levels) in with'
/api/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:64:in `handle_interrupt'
/api/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:64:in `block in with'
/api/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:61:in `handle_interrupt'
/api/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:61:in `with'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq.rb:94:in `redis'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/fetch.rb:36:in `retrieve_work'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/processor.rb:89:in `get_one'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/processor.rb:99:in `fetch'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/processor.rb:82:in `process_one'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/processor.rb:71:in `run'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/util.rb:16:in `watchdog'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/util.rb:25:in `block in safe_thread'
2021-09-28T21:05:28.723Z 1 TID-gpbvq809p WARN: Thread TID-gpbx4hk7x scheduler
2021-09-28T21:05:28.821Z 1 TID-gpbvq809p WARN: /api/vendor/bundle/ruby/2.6.0/gems/hiredis-0.6.3/lib/hiredis/ext/connection.rb:19:in `read'
/api/vendor/bundle/ruby/2.6.0/gems/hiredis-0.6.3/lib/hiredis/ext/connection.rb:19:in `read'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/connection/hiredis.rb:54:in `read'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:265:in `block in read'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:253:in `io'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:264:in `read'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:123:in `block in call'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:234:in `block (2 levels) in process'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:375:in `ensure_connected'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:224:in `block in process'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:312:in `logging'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:223:in `process'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis/client.rb:123:in `call'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis.rb:1576:in `block in zadd'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis.rb:50:in `block in synchronize'
/usr/local/lib/ruby/2.6.0/monitor.rb:230:in `mon_synchronize'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis.rb:50:in `synchronize'
/api/vendor/bundle/ruby/2.6.0/gems/redis-4.1.0/lib/redis.rb:1570:in `zadd'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-cron-1.1.0/lib/sidekiq/cron/job.rb:24:in `block in should_enque?'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq.rb:97:in `block in redis'
/api/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:65:in `block (2 levels) in with'
/api/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:64:in `handle_interrupt'
/api/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:64:in `block in with'
/api/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:61:in `handle_interrupt'
/api/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.2.2/lib/connection_pool.rb:61:in `with'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq.rb:94:in `redis'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-cron-1.1.0/lib/sidekiq/cron/job.rb:20:in `should_enque?'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-cron-1.1.0/lib/sidekiq/cron/job.rb:41:in `test_and_enque_for_time!'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-cron-1.1.0/lib/sidekiq/cron/poller.rb:28:in `enqueue_job'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-cron-1.1.0/lib/sidekiq/cron/poller.rb:15:in `block in enqueue'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-cron-1.1.0/lib/sidekiq/cron/poller.rb:14:in `each'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-cron-1.1.0/lib/sidekiq/cron/poller.rb:14:in `enqueue'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/scheduled.rb:68:in `block in start'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/util.rb:16:in `watchdog'
/api/vendor/bundle/ruby/2.6.0/gems/sidekiq-5.2.5/lib/sidekiq/util.rb:25:in `block in safe_thread'