/me prefers writing adblocking proxies in node.
--
Job Board: http://jobs.nodejs.org/
Posting guidelines: https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
You received this message because you are subscribed to the Google
Groups "nodejs" group.
To post to this group, send email to nod...@googlegroups.com
To unsubscribe from this group, send email to
nodejs+un...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/nodejs?hl=en
Of course it has to be under 1 second :)
125M per day = ~ 1.4K per second
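The conversion is just the daily total averaged over the 86,400 seconds in a day; a quick check:

```javascript
// Back-of-the-envelope: 125M requests/day averaged over one day in seconds.
const perDay = 125e6;
const perSecond = perDay / (24 * 60 * 60);
console.log(Math.round(perSecond)); // 1447, i.e. ~1.4K req/s
```

Note this is only the average; peak traffic will be higher, so the system needs headroom above 1.4K req/s.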
> On every request I have to write it to a database (IP, some query
> parameters, etc.) and increment a number of unique and non-unique
> impressions/clicks/leads.
> Unfortunately I have to use a MySQL database.
>
> It will run on servers with 2x E5620 CPUs (2.4 GHz, 4 cores) and 32GB
> RAM. Two servers will be used by the ad serving and tracking system,
> and another two by the database.
>
> Could you advise me on whether node.js could handle it?
> Maybe some of you could share experiences about production usage of
> node.js - rps numbers, latency, and the databases and servers used.
- You probably need several node instances, with a proxy like nginx in
front (I'm not sure cluster is the way to go for now).
- Using nginx will give you compression/caching without you doing
anything in your node code.
- Be careful if you're going to save an access history of every request
in your database or a log file: it will get huge very fast.
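One hedged sketch of how to avoid a row per request: aggregate counters in memory and flush one summary row per interval. All names here are illustrative, not part of any real tracker:

```javascript
// Hypothetical alternative to one INSERT per request: aggregate counts in
// memory, then flush a single summary row per campaign per interval.
const pending = new Map(); // campaignId -> { impressions, clicks }

function track(campaignId, kind) {
  const c = pending.get(campaignId) || { impressions: 0, clicks: 0 };
  c[kind]++;
  pending.set(campaignId, c);
}

function flush() {
  // In production this would become one batched MySQL
  // INSERT ... ON DUPLICATE KEY UPDATE instead of 1.4K inserts/second.
  const rows = [...pending.entries()].map(
    ([id, c]) => `(${id}, ${c.impressions}, ${c.clicks})`
  );
  pending.clear();
  return rows;
}

track(7, 'impressions');
track(7, 'impressions');
track(7, 'clicks');
console.log(flush()); // [ '(7, 2, 1)' ]
```

The trade-off is losing up to one interval's worth of counts on a crash; whether that is acceptable depends on how exact the impression numbers need to be.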
---
Diogo R.
Node can probably handle this load given the right coding and
clustering. If the machine is fast enough and the requests are simple
enough, a single process can almost handle it on its own.
This problem is exactly the type of thing that node is well suited for.
You're right, but I did not say latency or rps. I was talking about the
average number of requests that have to be handled per second.
> Typically, the more concurrent requests, the slower each request
> gets.
This depends on several things: whether the requests are of the same
type and can be cached (although, while the first one has not completed,
node will probably receive similar requests and have to process them
all, depending on the proxy), whether the database connection is kept
alive (I suppose it will be), whether every request can avoid uncached
access to disk, ...
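The "similar requests arriving while the first is in flight" point can be handled in-process by coalescing identical lookups onto one shared promise; a hedged sketch, with the database call simulated by a timer:

```javascript
// Hypothetical request coalescing: identical lookups that arrive while the
// first is still in flight share one underlying query instead of hitting
// the database N times.
const inflight = new Map();
let dbCalls = 0;

function cachedLookup(key) {
  if (inflight.has(key)) return inflight.get(key); // join the in-flight query
  const p = new Promise((resolve) => {
    dbCalls++; // stands in for one real MySQL query
    setTimeout(() => resolve(`row:${key}`), 10);
  }).finally(() => inflight.delete(key)); // allow a fresh query afterwards
  inflight.set(key, p);
  return p;
}

// three concurrent identical requests -> one "database" call
Promise.all([cachedLookup('ip1'), cachedLookup('ip1'), cachedLookup('ip1')])
  .then((rows) => console.log(dbCalls, rows[0])); // 1 row:ip1
```

This only deduplicates requests landing on the same process; with several node instances behind nginx, each instance coalesces its own share.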
---
Diogo R.