Couple of links which might be of interest, plus a few rough sketches inline
against your numbered points below ...
http://www.jeffknupp.com/blog/2012/02/14/profiling-django-applications-a-journey-from-1300-to-2-queries/
http://gun.io/blog/fast-as-fuck-django-part-1-using-a-profiler/
ymmv
Mike
> 1. DB Connection pooling. (Might address the time associated with
> opening/closing connections)
> 2. It sounds like Django may soon support keeping a connection open
> for a set period of time. (Same as above)
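
On 1 and 2: the persistent-connection support is the CONN_MAX_AGE setting
(due in Django 1.6, if I remember right); proper pooling usually means
putting something like pgbouncer in front of the database. Once you're on a
version that has CONN_MAX_AGE it's just a settings change -- a minimal
sketch, with the connection details obviously being placeholders:

    # settings.py -- reuse each DB connection for up to 10 minutes
    # instead of opening/closing one per request (Django 1.6+)
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql_psycopg2',
            'NAME': 'mydb',        # placeholder
            'USER': 'myuser',      # placeholder
            'HOST': 'localhost',
            'CONN_MAX_AGE': 600,   # seconds; 0 = old close-per-request behaviour
        }
    }
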
> 3. Switching my properties to actual columns in the DB. I should be
> able to keep these consistent since many of the tables are actually
> static information. I think there's one point where saving occurs
> for any of the dynamic DB information that goes into
> the calculation. (Should help with some overhead -- a complex query
> would be reduced to a simple query).
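
On 3, a sketch of what a denormalised column can look like -- the model and
field names here are made up, not taken from your code:

    from django.db import models

    class Property(models.Model):
        # stored copy of what the old @property computed on the fly,
        # refreshed whenever the row is saved
        cached_score = models.FloatField(null=True, blank=True)

        def compute_score(self):
            # whatever the old property did (walking the related tables)
            return sum(i.value for i in self.items.all())

        def save(self, *args, **kwargs):
            self.cached_score = self.compute_score()
            super(Property, self).save(*args, **kwargs)

Since most of the underlying tables are static, you only pay the recompute
cost on the occasional save rather than on every read.
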
> 4. Django's cache framework. (Could make significant improvements --
> but this cache is only kept for a certain period of time. Some
> users will still have to wait.)
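
On 4, the cache framework is a few lines wrapped around the expensive call;
get_expensive_result and run_expensive_calculation below are stand-ins for
whatever your code actually does:

    from django.core.cache import cache

    def get_expensive_result(prop_id):
        key = 'prop-result-%s' % prop_id
        result = cache.get(key)
        if result is None:
            result = run_expensive_calculation(prop_id)  # your existing code
            cache.set(key, result, 60 * 15)              # keep for 15 minutes
        return result

As you say, the first user after the timeout still waits; everyone else gets
the cached copy.
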
> 5. Database tuning (Don't know much about this...)
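
On 5, the lowest-hanging fruit from the Django side is usually just making
sure the columns you filter and join on are indexed (field names below are
invented):

    from django.db import models

    class Measurement(models.Model):
        prop = models.ForeignKey('Property')                # FKs are indexed automatically
        recorded_at = models.DateTimeField(db_index=True)   # indexed because it's filtered on

Beyond that it's EXPLAIN output and your database's own tuning docs.
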
> 6. It seems like it might be inefficient that each property is
> individually moving through the first three tables (is this the
> right way to say this??) since the prop_id isn't used until we get
> to m2mfield_set_all(). I could condense the number of tables that
> need to be traversed. However, it seems like this would make the
> number of calls to the DB equal to 2*n+1 rather than 2*n, where n is
> the number of prop_ids I'm using.
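
On 6, before reorganising tables it's worth checking whether
prefetch_related() already collapses the per-property queries -- again with
made-up names:

    # roughly 2*n queries: each property fetched on its own, then its m2m rows
    for pid in prop_ids:
        prop = Property.objects.get(pk=pid)
        for item in prop.m2mfield.all():
            pass  # do the calculation

    # a constant handful of queries: one for the properties, one for all the
    # m2m rows, stitched together in Python by prefetch_related()
    qs = Property.objects.filter(pk__in=prop_ids).prefetch_related('m2mfield')
    for prop in qs:
        for item in prop.m2mfield.all():   # no extra query here
            pass  # same calculation
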
> 7. Maybe I need to check out the query this is making and write my own
> custom query if it's being inefficient?
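
For 7, you can see what the ORM is actually sending before deciding to
hand-write anything (connection.queries only fills up when DEBUG=True):

    from django.db import connection

    qs = Property.objects.filter(pk__in=prop_ids)
    print(qs.query)                 # roughly the SQL Django intends to run

    list(qs)                        # force evaluation
    print(connection.queries[-1])   # the SQL actually sent, plus how long it took
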
>
> I guess my questions are:
>
> * Is that line of code terrible? Should I be doing this a better way?
> * If not, how do I do some time profiling to determine which of the
> above I should do? Or should I just do each one and see if it improves?
>
>
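
And on the profiling question itself: with DEBUG=True you can count and time
the queries around the suspect code directly (django-debug-toolbar will show
you the same numbers per page); do_the_calculation is a stand-in for your
code:

    from django.db import connection, reset_queries

    reset_queries()
    do_the_calculation(prop_ids)
    print('%d queries' % len(connection.queries))
    print('%.3f seconds in the DB' % sum(float(q['time']) for q in connection.queries))

That should tell you fairly quickly which of the options above is actually
worth the effort before you commit to any of them.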