Managing load on remote resources during asynchronous requests

Valerio Tesei

May 24, 2018, 9:59:39 AM
to nodejs
Hi There,

I'm trying to learn Node and I've come across a tricky question. I used to turn to mailing lists like this one back in my C days, so I thought it might be good to ask the community.

As an exercise, I've built a small system that reads from a database and pushes data to a remote server.

Something simple, I thought...

I have 20,000 elements that need to go from a local database to a remote REST API.

In PHP I would:
  1. load from the database
  2. process the data
  3. post the digest to the remote endpoint
Attempting to do this in Node.js, I ran into a big issue: the remote server was rejecting the requests because the request rate was too high.

It didn't take me long to realise I was flooding the remote endpoint.
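
For context, my first attempt was essentially a direct translation of that PHP flow: map every row to a request and fire them all at once. A minimal sketch of the pattern (the host, path and run(rows) entry point are placeholders, not my real code):

// Naive version: every row becomes an in-flight request immediately,
// so all 20,000 POSTs hit the remote endpoint at roughly the same time.
const https = require('https');

function postItem(item) {
  return new Promise((resolve, reject) => {
    const body = JSON.stringify(item);
    const req = https.request({
      hostname: 'example.com',          // placeholder host
      path: '/api/items',               // placeholder endpoint
      method: 'POST',
      headers: { 'Content-Type': 'application/json' }
    }, res => {
      res.resume();                     // drain the response body
      res.on('end', () => resolve(res.statusCode));
    });
    req.on('error', reject);
    req.end(body);
  });
}

async function run(rows) {
  // rows: the elements loaded from the local database
  await Promise.all(rows.map(postItem));   // <-- this is what floods the server
}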

The only solution I've found is to execute a cascade of promises, with a timer polling the request status so that only one request is allowed at a time:
  1. load from the database
  2. push the elements to an array (the queue)
  3. use setTimeout to execute a polling function
  4. every time a request completes, execute a callback for the next element
    1. each callback marks the element in the queue as either error or done
The pace becomes one request at a time, as fast as possible (within roughly tickMs intervals); a rough sketch follows below.
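
Roughly, that workaround looks like this (simplified; tickMs, the queue item shape and the postItem stub are illustrative stand-ins for my actual code):

// Stand-in for the real HTTP call: one promise per POST (see the naive sketch above).
const postItem = item => new Promise(resolve => setTimeout(resolve, 100));

const rowsFromDb = [{ id: 1 }, { id: 2 }, { id: 3 }];   // stand-in for the 20,000 rows
const queue = rowsFromDb.map(row => ({ row, status: 'pending' }));
const tickMs = 250;
let inFlight = false;

function poll() {
  const next = queue.find(el => el.status === 'pending');
  if (!next) return;                       // nothing left to schedule, stop polling
  if (!inFlight) {
    inFlight = true;
    next.status = 'in-flight';
    postItem(next.row)
      .then(() => { next.status = 'done'; })
      .catch(() => { next.status = 'error'; })
      .then(() => { inFlight = false; });  // free the slot for a later tick
  }
  setTimeout(poll, tickMs);                // keep polling the queue
}

setTimeout(poll, tickMs);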

I could add a concurrency counter to execute X requests at a time, but that doesn't really address my concern, it just works around it (a sketch of that variant is below).
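
For reference, the concurrency-counter variant I have in mind would be something like this (again just a sketch; maxConcurrent and the postItem stub are illustrative):

// Stand-in for the real HTTP call: one promise per POST.
const postItem = item => new Promise(resolve => setTimeout(resolve, 100));

function processAll(rows, maxConcurrent) {
  return new Promise(resolve => {
    let index = 0;      // next row to send
    let active = 0;     // requests currently in flight
    let finished = 0;   // requests settled so far

    function launchNext() {
      // keep starting requests until the concurrency limit is reached
      while (active < maxConcurrent && index < rows.length) {
        const row = rows[index++];
        active++;
        postItem(row)
          .catch(() => { /* mark this element as failed here */ })
          .then(() => {
            active--;
            finished++;
            if (finished === rows.length) resolve();
            else launchNext();   // a slot freed up, start another request
          });
      }
    }

    if (rows.length === 0) resolve();
    else launchNext();
  });
}

// Usage: processAll(rowsFromDb, 5).then(() => console.log('all sent'));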

I know it's barbaric; that's why I am here.


I did not use events; setTimeout was quicker to get working ;)

My point is: how do you efficiently manage how often asynchronous operations are executed? How would you solve my problem?

Any comment on the code is welcome; I'd love feedback from experienced developers.

Cheers,