When I stress test the dynamic part of the site, I am only getting about
300 requests per second on my test setup. This is using two Dell 1950's
(one for web, one for mysql database). These are very powerful machines
(3.0ghz Xeons, 8 cores each, 16 gig of ram, 15k SAS drives, etc.)
I've done all the standard performance tuning I've found while reading
through various websites about Django tuning. Is this the performance I
should expect?
When I'm running the load test, the CPU on the web server gets
completely buried (even with 8 cores). The mysql server doesn't seem
loaded at all. Any suggestions on how to find the bottlenecks?
I'm running Ubuntu 7.10 (server, 64 bit version). Django from a very
recent trunk.
Thanks for any advice.
Richard Coleman
rcol...@criticalmagic.com
What pages are you requesting, and have you profiled them to understand
what's taking so long?
-joe
You must be using manipulators in oldforms. Try newforms.
1. mod_python
2. apache 2.2.4
3. I'm using funkload and ab to measure the requests per second of one
of the base pages within the dynamic part of the website
4. When I hit a static page in the same way (using ab), I get 6500
requests per second.
5. This is without memcached, or any other caching.
Richard Coleman
rcol...@criticalmagic.com
I forgot to mention these:
6. django debug turned off
7. mod_python debug turned off
8. django template debugging turned off
9. apache maxclients cranked up to 1000 (although it never gets close to
that many processes).
Richard Coleman
rcol...@criticalmagic.com
Firebug:
http://www.getfirebug.com/
will show you how long each component of the page takes from request to
response and shows you what components are returned in parallel (among
other really cool features). This can help identify your bottleneck.
For example, if your images are taking longer than expected while the
page itself returns quickly then you may want to serve images with a
dedicated web server, like lighttpd.
Yahoo's 'yslow' tool gives your page a 'grade' and offers advice on how
to improve your grade. It also has other handy bits (JSLint, Empty vs.
Primed cache comparisons).
http://developer.yahoo.com/yslow/
The grades are based on their 'Rules for High Performance Websites':
http://developer.yahoo.com/performance/index.html#rules
Hm, funny, their yslow page scores a 'D' :)
2. It's gigE between the web server and mysql server, on a dedicated
switch. The database is working pretty hard (7000 selects per second),
but doesn't seem to be the bottleneck. The webserver is hammered
(typing any command takes a long time).
3. I'm using prefork MPM on apache with maxclients set to 1000.
We are starting to experiment with caching, but I want to improve the
raw performance of the site as well.
Richard Coleman
rcol...@criticalmagic.com
There's a wiki page on that very process here:
http://code.djangoproject.com/wiki/ProfilingDjango and, additionally,
when you enable DEBUG you can see how long the SQL queries for that
dynamic page are taking. There's a nice snippet to help there at
http://www.djangosnippets.org/snippets/93/
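As a rough sketch of that DEBUG approach (the decorator name below is
made up, and it assumes DEBUG = True): Django records every query and
its time in django.db.connection.queries, so you can wrap whichever view
you're benchmarking:

    import sys
    from django.db import connection

    def log_query_times(view_func):
        # Wraps a view and writes each query's time to stderr
        # (which ends up in Apache's error log under mod_python).
        # Django only populates connection.queries when DEBUG is True.
        def wrapper(request, *args, **kwargs):
            response = view_func(request, *args, **kwargs)
            total = sum(float(q['time']) for q in connection.queries)
            print >> sys.stderr, "%d queries, %.3f sec" % (
                len(connection.queries), total)
            for q in connection.queries:
                print >> sys.stderr, q['time'], q['sql'][:80]
            return response
        return wrapper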
-joe
On Tue, Dec 11, 2007 at 01:37:36PM -0500, Richard Coleman wrote:
> Rajesh Dhawan wrote:
> >> When I stress test the dynamic part of the site, I am only getting about
> >> 300 requests per second on my test setup. This is using two Dell 1950's
> >> (one for web, one for mysql database). These are very powerful machines
> >> (3.0ghz Xeons, 8 cores each, 16 gig of ram, 15k SAS drives, etc.)
> >
> > - How are you running your Django app? Mod_python? FastCGI?
> > - What web server are you using?
> > - What's the nature of the dynamic Django request you are measuring?
> > How many DB queries does it make? Is DEBUG mode off?
> > - What kind of numbers do you get when you serve a simple and small
> > HTML file statically from your Web server without Django?
> > - What kind of numbers do you get when the Django view you are
> > benchmarking goes directly to a simple template (i.e. no DB queries)?
> > - Are you using memcache?
>
> 1. mod_python
> 2. apache 2.2.4
> 3. I'm using funkload and ab to measure the requests per second of one
> of the base pages within the dynamic part of the website
> 4. When I hit a static page in the same way (using ab), I get 6500
> requests per second.
> 5. This is without memcached, or any other caching.
I'm not all that qualified to comment as I've never done this sort of testing,
but I do wonder if the performance you are seeing may not be all that
unreasonable, given that no caching is currently being performed? Using
memcached has been known to improve performance dramatically.
Profiling never hurts, but there's nothing wrong with doing the easy stuff
first, right?
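For what it's worth, the easy stuff looks roughly like this (a sketch
only: it assumes a memcached daemon on 127.0.0.1:11211, and the view
name is just a placeholder):

    # settings.py
    CACHE_BACKEND = 'memcached://127.0.0.1:11211/'

    # views.py -- cache the output of one expensive view for 60 seconds
    from django.http import HttpResponse
    from django.views.decorators.cache import cache_page

    def front_page(request):
        # ... expensive queries and rendering here ...
        return HttpResponse("rendered page")

    # cache_page(view, seconds) is the documented form for this Django vintage
    front_page = cache_page(front_page, 60)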
-Forest
--
Forest Bond
http://www.alittletooquiet.net
The machine is definitely not swapping. It has 16 gig of ram, and top
never shows more than 7 gig of ram in use when I do the load testing.
I have maxclients set to 1000, but the number of apache processes never
gets close to that value. I've played around with various settings for
maxclient, but as long as I don't set it too low, this never comes into
play.
Richard Coleman
rcol...@criticalmagic.com
Please make a 'hello world' view to simplify the troubleshooting.
If it's still "only" 300 r/s without even hitting the DB, the problem is
narrowed down considerably.
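Something like this is enough as a baseline (the module and URL names
below are just placeholders):

    # views.py
    from django.http import HttpResponse

    def hello(request):
        # No DB, no templates -- measures the Django/mod_python overhead alone.
        return HttpResponse("hello world")

    # urls.py
    from django.conf.urls.defaults import *

    urlpatterns = patterns('',
        (r'^hello/$', 'mysite.views.hello'),
    )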
You haven't listed your ab parameters; if you're using maxclients so
high, perhaps you're paying the process startup cost (nearly) each
time, so consider setting startservers high. In general, if you have
a dedicated web server, setting startservers near maxclients isn't a
bad way to go.
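For example, something along these lines in httpd.conf (illustrative
values only; note that with the prefork MPM, MaxClients above 256 also
requires raising ServerLimit, or it gets silently capped):

    <IfModule mpm_prefork_module>
        ServerLimit          1000
        StartServers         1000
        MinSpareServers       100
        MaxSpareServers      1000
        MaxClients           1000
        MaxRequestsPerChild     0
    </IfModule>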
We are looking at using the memcached API in our code, and I'm sure we
will get a speedup there. But before we start down that route, I wanted
to make sure we weren't doing anything silly on the raw site.
I think my next step is to use the profiler and find out why the site is
so CPU intensive.
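For reference, this is roughly the kind of per-view profiling I have in
mind (a sketch using the standard library's cProfile; the decorator name
is made up):

    import sys
    import cProfile
    import pstats

    def profile_view(view_func):
        # Profiles each call to the wrapped view and prints the 20 most
        # CPU-hungry functions (by cumulative time) to Apache's error log.
        def wrapper(request, *args, **kwargs):
            profiler = cProfile.Profile()
            response = profiler.runcall(view_func, request, *args, **kwargs)
            stats = pstats.Stats(profiler, stream=sys.stderr)
            stats.sort_stats('cumulative').print_stats(20)
            return response
        return wrapper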
Thanks for all the help.
Richard Coleman
rcol...@criticalmagic.com
1. Static content and dynamic content are on the same server (that will change
in production), but they are on different Apache virtual hosts. Mod_python
is turned off for the static stuff. Hitting only a static page is very
fast (6500 requests per second). Hitting the dynamic side is slow (300
requests per second).
2. It's gigE between the web server and mysql server, on a dedicated
switch. The database is working pretty hard (7000 selects per second),
but doesn't seem to be the bottleneck. The webserver is hammered
(typing any command takes a long time).
Again I suggest a hello-world view as a baseline, but if you really
are doing 7000 queries per second, perhaps you're also constructing a
lot of ORM objects; there's some overhead to the signalling (pre/post
init, pre/post save).
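As one concrete example of trimming that overhead (the model here is
purely hypothetical): values() returns plain dictionaries, which skips
model-instance construction and the init signals entirely.

    from django.db import models

    class Article(models.Model):   # hypothetical model, for illustration only
        title = models.CharField(max_length=100)
        published = models.BooleanField(default=False)

    # Full model instances: runs __init__ plus pre/post init signals per row.
    articles = list(Article.objects.filter(published=True))

    # Plain dicts with only the columns you need: no instances, no signals.
    articles = list(Article.objects.filter(published=True).values('id', 'title'))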
Just a possibility:
If you're launching huge numbers of requests, but all from the same
machine (e.g., the one running ab), your request-generating machine
may be saturating its own bandwidth, which could cause Apache
processes on the web nodes to stall as they wait for you to be able to
receive data, which in turn leads to requests piling up and increased
load on the web node (but not on the DB node, which doesn't notice
anything unusual).
Not saying that's necessarily what's happening, but it's something to
look into. Also, 300 req./sec. for something you apparently haven't
done much profiling on is pretty damned impressive in my book.
--
"Bureaucrat Conrad, you are technically correct -- the best kind of correct."
Have you tried a connection pool?
--
Nic Ferrier
http://www.woome.com - Enjoy the minute!
Nic writes: