Hey,
I need to set up a proxy where only one POST request is sent to the backend at a time. Other requests should be queued up for as long as resources (memory) permit.
Something like this is good:
http://wiki.nginx.org/HttpLimitReqModule
Except that it returns a 503 right away when the limit is reached.
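One thing I did notice: limit_req takes a burst parameter, and without nodelay, excess requests up to the burst are held and released at the configured rate instead of being rejected; only requests beyond the burst get the 503. Something like this sketch (real-backend is a placeholder, and a rate of 1r/s is only an approximation of "one at a time"):

```
http {
    # one shared bucket keyed on $server_name, releasing one request per second
    limit_req_zone $server_name zone=backend_q:1m rate=1r/s;

    server {
        location /upload {
            # queue up to 100 excess requests; beyond that, reject with 503
            limit_req zone=backend_q burst=100;
            proxy_pass http://real-backend;
        }
    }
}
```

This throttles the request rate rather than limiting concurrency to exactly one in-flight request, but it might be close enough for testing.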
So, I thought I could write something using the lua-nginx-module.
Has anyone done something like this?
Where do I start?
Are there other load balancers or proxy servers that can queue up incoming requests and delegate to backend at certain rate?
I was thinking about using ngx.shared.DICT to store incoming requests and having another ngx.thread that pops one request at a time and calls ngx.location.capture(backend_uri).
But since I am targeting POST requests, I'm not sure it is feasible to read each request body and store them all in ngx.shared.DICT.
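An alternative I'm considering, which avoids storing bodies in a shared dict entirely: keep each request in-flight and serialize access to the backend with lua-resty-lock, so waiting requests queue in the event loop (via ngx.sleep internally) without blocking the worker. A rough sketch, assuming OpenResty with lua-resty-lock installed (location names and the 60s timeout are made up):

```
http {
    lua_shared_dict my_locks 1m;

    server {
        location = /proxy {
            content_by_lua_block {
                local resty_lock = require "resty.lock"
                -- wait up to 60s for our turn at the backend
                local lock, err = resty_lock:new("my_locks", { timeout = 60 })
                if not lock then
                    ngx.exit(500)
                end
                local elapsed, err = lock:lock("backend_serial")
                if not elapsed then
                    -- gave up waiting; only now do we return 503
                    ngx.exit(503)
                end
                -- forward the original method and body to the backend
                ngx.req.read_body()
                local res = ngx.location.capture("/backend", {
                    method = ngx.HTTP_POST,
                    always_forward_body = true,
                })
                lock:unlock()
                ngx.status = res.status
                ngx.print(res.body)
            }
        }

        location /backend {
            internal;
            proxy_pass http://real-backend;
        }
    }
}
```

Holding the lock across the ngx.location.capture call means exactly one subrequest hits the backend at a time; everyone else parks on the lock until it's their turn or they time out.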
I feel like I'm using the wrong tool or solution for the problem, or did not define the problem properly.
Basically, my hypothesis is that the backend (which I don't control) has an issue with concurrent writes (i.e., handling concurrent POST requests). I wanted to validate that by sending concurrent POSTs to a proxy that forwards them to the backend at a much slower rate.
What would you do if you had a slow backend that can only accept one request at a time, and clients dumb enough that they don't retry on 503?