Hello, I am using Django 1.3 on Apache 2.* on Windows Server 2008, with MySQL 5.*.
When I try to access a view that does this:

    def view(request):
        len(User.objects.all())
        return HttpResponse()
I have about 9K rows in that table. I am doing len() and not .count()
because I want to stress-test the server.
The server becomes really slow.
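The difference between the two calls matters here. A minimal sketch using stdlib sqlite3 (a stand-in for the MySQL backend, not the poster's actual setup) of what each one asks the database to do:

```python
# A sketch of why len(queryset) is much heavier than .count():
# len() materializes every row in Python, while COUNT(*) lets the
# database return a single number.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE auth_user (id INTEGER PRIMARY KEY, username TEXT)")
conn.executemany(
    "INSERT INTO auth_user (username) VALUES (?)",
    [(f"user{i}",) for i in range(9000)],
)

# What len(User.objects.all()) effectively does: fetch all 9000 rows.
rows = conn.execute("SELECT * FROM auth_user").fetchall()
print(len(rows))        # 9000, after transferring every row

# What User.objects.count() effectively does: one aggregate query.
(n,) = conn.execute("SELECT COUNT(*) FROM auth_user").fetchone()
print(n)                # 9000, with a single scalar result
```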
I am using AWS EC2 & RDS, so I am ruling that out. I monitor every query for performance, but User.objects.all() is a very simple query.
What do you think it could be, other than those?
Thanks for all the help.
--
You received this message because you are subscribed to the Google Groups "Django users" group.
To view this discussion on the web visit https://groups.google.com/d/msg/django-users/-/gTlmPOgGsIkJ.
To post to this group, send email to django...@googlegroups.com.
To unsubscribe from this group, send email to django-users...@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/django-users?hl=en.
I have 9000 rows and am using mod_wsgi on Apache 2. What am I doing wrong? I tried switching to nginx + FastCGI, and tried Ubuntu Server as well. Same results.
Any ideas?
Do you have DEBUG=False in your settings?
--
Ramiro Morales
of course... I have been using Django since Dec '10, and now I don't know what to do.
ok, then that len(User.objects.all()) is not an appropriate benchmark, right?
what you're doing is a view that fetches all 9K records from the DB,
without any filtering. will your views do that? usually you should
filter in the DB, where it can benefit from any indexes, and process
only those records that are relevant for that view.
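The "filter in the DB" advice can be sketched with stdlib sqlite3 (an illustration, not the poster's schema): with an index on the filtered column, the database narrows the result before any rows reach Python.

```python
# Equivalent of User.objects.filter(is_active=True) versus fetching
# everything: only the relevant rows leave the database, and the
# index keeps the lookup cheap.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE auth_user (id INTEGER PRIMARY KEY, is_active INTEGER)")
conn.execute("CREATE INDEX idx_active ON auth_user (is_active)")
conn.executemany(
    "INSERT INTO auth_user (is_active) VALUES (?)",
    [(1 if i % 100 == 0 else 0,) for i in range(9000)],
)

active = conn.execute(
    "SELECT id FROM auth_user WHERE is_active = 1"
).fetchall()
print(len(active))   # 90 rows instead of 9000
```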
if you _have_ to do heavy processing on big portions of the data, then
you should do some denormalization, preprocessing, or queue
management.
why don't you explain what you want to achieve?
--
Javier
It was used purposefully, to test heavy system performance.
I don't think you're getting any valuable data on your application's
actual performance: you basically made a view that just loops over
9000 items. Of course it will be slow if you keep reloading it!
Try using a tool like "ab" (ApacheBench) or tsung to hit real
application pages quickly and concurrently.
well, if you do a heavy and memory-intensive task, then doing it ten
times is ten times as much processing and ten times as much memory. I
think it's OK for it to take 20 times as much time.
> my team member, tried to find out the problem, and his conclusion is -
> python+django is not fast. and i DONT like that. its not right!
no, the conclusion is that doing slow things is not fast
--
Javier
it took 20 times longer for EACH request. That means that if 20 users across the web ask for different heavy data, then instead of each one taking a second, every user will wait 20 seconds for a response.
If you're using MyISAM tables, that's a big performance killer. Every
new or expired session is going to get an exclusive table lock on the
session table. That will block reads for everyone else until the new row
is inserted. In the meantime, Apache is forking or spawning new
threads, and you're creating more MySQL connections, which in turn is
consuming more RAM. Eventually, you'll run out of RAM and start hitting
swap/virtual memory. That only makes things worse. You have a contrived,
long-running query that only exacerbates the problem because while that
query is running, inserts or updates have to wait. All you've proved is
that you can create a cascade effect, which is hardly surprising.
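The cascade described above can be modelled in miniature with a threading.Lock standing in for MyISAM's table lock (a toy analogy, not MySQL itself): one exclusive lock serializes every "writer", so concurrency buys nothing.

```python
# Ten 20 ms "session inserts" behind one exclusive lock take ~200 ms
# total: the work serializes even though ten threads run concurrently,
# and everyone else queues up behind the lock holder.
import threading
import time

table_lock = threading.Lock()   # stands in for MyISAM's table lock
completed = []

def write_session(i):
    with table_lock:            # exclusive: blocks all other workers
        time.sleep(0.02)        # the INSERT itself
        completed.append(i)

start = time.monotonic()
threads = [threading.Thread(target=write_session, args=(i,)) for i in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.monotonic() - start

print(len(completed), round(elapsed, 2))
```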
> my team member, tried to find out the problem, and his conclusion is
> - python+django is not fast. and i DONT like that. its not right!
I don't see how you can jump to that conclusion based on the evidence
you presented. Django powers some very busy sites on quite modest
hardware comparatively speaking but Django isn't going to make
inefficient code or poor architectural choices, like MyISAM tables,
magically better. You need to do some real profiling, not some contrived
test that proves that running slow queries is, well... slow.
--
Regards,
Clifford Ilkay
Dinamis
1419-3230 Yonge St.
Toronto, ON
Canada M4N 3P6
Thanks! Do you have any advice for good performance, and for profiling?
Failing that, run a middleware or other logging tool to tell you which
pages are slow. Then test with Django Debug Toolbar. Rinse and repeat.
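A slow-page logger of that kind can be sketched as plain WSGI middleware (the TimingMiddleware name and threshold are illustrative, not from any package):

```python
# Wraps a WSGI app and records any request that exceeds a time budget.
import time

class TimingMiddleware:
    def __init__(self, app, threshold=0.5):
        self.app = app
        self.threshold = threshold
        self.slow_pages = []

    def __call__(self, environ, start_response):
        start = time.monotonic()
        response = self.app(environ, start_response)
        elapsed = time.monotonic() - start
        if elapsed > self.threshold:
            self.slow_pages.append((environ.get("PATH_INFO", ""), elapsed))
        return response

# Demo with a deliberately slow dummy app.
def slow_app(environ, start_response):
    time.sleep(0.05)
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ok"]

wrapped = TimingMiddleware(slow_app, threshold=0.01)
body = wrapped({"PATH_INFO": "/heavy/"}, lambda status, headers: None)
print(wrapped.slow_pages)   # records ('/heavy/', ...) once the budget is exceeded
```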
Here are a couple:
Contrived tests don't prove anything.
MySQL blows for concurrency; it is heavily reliant on underlying IOPS.
Your team member is wrong to assume "python+django is not fast".
Re-implement your benchmark in PHP*, and observe the same results. The
performance of any web application largely comes down to what you are
doing in the database and how you are doing it. The choice of framework
is largely irrelevant; you can do stupid things in any
language/framework.
Cheers
Tom
* I am now anticipating having a non-comparable PHP benchmark shoved
in my face. Django constructs objects for each user in that list**,
any PHP equivalent benchmark should do the same.
** No, that isn't Django being daft, it is what you instructed it to
do with len(User.objects.all()).
so the total time was 150 seconds? you said it was 10 parallel
requests, then 20 times longer is still not bad.
On Tue, Apr 17, 2012 at 3:55 AM, Tom Evans <teva...@googlemail.com> wrote:
> Contrived tests don't prove anything.
absolutely.
if you set a purposely slow task, don't expect it to be fast.
heavy, memory intensive tasks are not supposed to be done
concurrently, and definitely not in the request/response loop. that's
why we have offline tasks and queue managers.
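A minimal sketch of that pattern with the stdlib queue module (a stand-in for a real queue manager such as Celery): the request handler only enqueues work and returns, while a worker outside the request cycle does the heavy part.

```python
# A single background worker drains jobs FIFO; the "view" never
# blocks on the heavy computation itself.
import queue
import threading

tasks = queue.Queue()
results = []

def worker():
    # Runs outside the request/response loop, draining the queue.
    while True:
        job = tasks.get()
        if job is None:     # sentinel: shut the worker down
            break
        results.append(job * job)   # the "heavy" computation
        tasks.task_done()

t = threading.Thread(target=worker)
t.start()

# A view would just enqueue and return immediately:
for job in range(5):
    tasks.put(job)

tasks.join()    # wait here only for demonstration purposes
tasks.put(None)
t.join()
print(results)  # [0, 1, 4, 9, 16]
```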
--
Javier