MySQL total overheat (somewhat complex database)


lubos

Jul 28, 2012, 2:44:00 PM
to django...@googlegroups.com
Hello,

I have a fairly complex database with heavily interconnected tables, and for the top-level table Django produces queries like this:

SELECT `data_schedule`.`id`, `data_schedule`.`process_id`, `data_schedule`.`hardware_id`, `data_schedule`.`task_id`, `data_schedule`.`start`, `data_schedule`.`times`, `data_schedule`.`done`, `data_process`.`id`, `data_process`.`string_id`, `data_process`.`operation_id`, `data_string`.`id`, `data_string`.`immutable`, `data_operation`.`id`, `data_operation`.`string_id`, T5.`id`, T5.`immutable`, `data_hardware`.`id`, `data_hardware`.`string_id`, `data_hardware`.`workspace_id`, `data_hardware`.`warmup`, T7.`id`, T7.`immutable`, `data_workspace`.`id`, `data_workspace`.`string_id`, T9.`id`, T9.`immutable`, `data_task`.`id`, `data_task`.`request_id`, `data_task`.`thread_id`, `data_task`.`amount`, `data_request`.`id`, `data_request`.`order_id`, `data_request`.`item_id`, `data_request`.`amount`, `data_order`.`id`, `data_order`.`string_id`, `data_order`.`subject_id`, `data_order`.`type`, `data_order`.`shipped`, `data_order`.`paid`, `data_order`.`end`, T13.`id`, T13.`immutable`, `data_subject`.`id`, `data_subject`.`string_id`, T15.`id`, T15.`immutable`, `data_item`.`id`, `data_item`.`string_id`, `data_item`.`measure_id`, `data_item`.`amount`, T17.`id`, T17.`immutable`, `data_measure`.`id`, `data_measure`.`string_id`, T19.`id`, T19.`immutable`, `data_thread`.`id`, `data_thread`.`string_id`, T21.`id`, T21.`immutable`
FROM `data_schedule`
INNER JOIN `data_process` ON (`data_schedule`.`process_id` = `data_process`.`id`)
INNER JOIN `data_string` ON (`data_process`.`string_id` = `data_string`.`id`)
INNER JOIN `data_operation` ON (`data_process`.`operation_id` = `data_operation`.`id`)
INNER JOIN `data_string` T5 ON (`data_operation`.`string_id` = T5.`id`)
INNER JOIN `data_hardware` ON (`data_schedule`.`hardware_id` = `data_hardware`.`id`)
INNER JOIN `data_string` T7 ON (`data_hardware`.`string_id` = T7.`id`)
INNER JOIN `data_workspace` ON (`data_hardware`.`workspace_id` = `data_workspace`.`id`)
INNER JOIN `data_string` T9 ON (`data_workspace`.`string_id` = T9.`id`)
INNER JOIN `data_task` ON (`data_schedule`.`task_id` = `data_task`.`id`)
INNER JOIN `data_request` ON (`data_task`.`request_id` = `data_request`.`id`)
INNER JOIN `data_order` ON (`data_request`.`order_id` = `data_order`.`id`)
INNER JOIN `data_string` T13 ON (`data_order`.`string_id` = T13.`id`)
INNER JOIN `data_subject` ON (`data_order`.`subject_id` = `data_subject`.`id`)
INNER JOIN `data_string` T15 ON (`data_subject`.`string_id` = T15.`id`)
INNER JOIN `data_item` ON (`data_request`.`item_id` = `data_item`.`id`)
INNER JOIN `data_string` T17 ON (`data_item`.`string_id` = T17.`id`)
INNER JOIN `data_measure` ON (`data_item`.`measure_id` = `data_measure`.`id`)
INNER JOIN `data_string` T19 ON (`data_measure`.`string_id` = T19.`id`)
INNER JOIN `data_thread` ON (`data_task`.`thread_id` = `data_thread`.`id`)
INNER JOIN `data_string` T21 ON (`data_thread`.`string_id` = T21.`id`)
ORDER BY `data_schedule`.`id` ASC

MySQL, however, isn't able to process it; after a few minutes it is still running and pegs the CPU at 100%.

I would be grateful for any ideas.

Lubos

Javier Guerra Giraldez

Jul 28, 2012, 3:40:42 PM
to django...@googlegroups.com
On Sat, Jul 28, 2012 at 1:44 PM, lubos <lubos...@gmail.com> wrote:
> MySQL, however, isn't able to process it and after few minutes it holds all
> processor performance.

That looks like a rather simple star schema. Maybe there are missing
indexes... were those tables created by Django?

--
Javier
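[For reference: Django creates an index on every ForeignKey column by default, so tables built by syncdb should already have them; only non-FK columns used for filtering or ordering need `db_index` set explicitly. A minimal sketch — model and field names are guesses inferred from the column list in the query, not the poster's actual code:]

```python
# Hypothetical excerpt of the models behind the query above.
from django.db import models

class Schedule(models.Model):
    # ForeignKey columns get db_index=True automatically, so
    # data_schedule.process_id etc. should already be indexed
    # if Django created the tables.
    process = models.ForeignKey('Process')
    hardware = models.ForeignKey('Hardware')
    task = models.ForeignKey('Task')
    # Non-FK columns are NOT indexed unless asked for explicitly:
    start = models.DateTimeField(db_index=True)
```

[So missing FK indexes are unlikely if syncdb built the schema; the sheer join count is the more probable culprit.]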

lubos

Jul 28, 2012, 3:59:46 PM
to django...@googlegroups.com
Yes, they were.

akaariai

Jul 28, 2012, 6:35:41 PM
to Django users
Are you using select_related()? To me it seems you are just doing too
many joins in one query, and MySQL just can't handle it. If
select_related is the source of this problem, check if using
prefetch_related instead helps.

- Anssi
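[A sketch of Anssi's suggestion, with queryset and relation names assumed rather than taken from the poster's code. `select_related()` with no arguments follows every non-null ForeignKey, which is exactly how a 21-table JOIN like the one above arises:]

```python
# Option 1: restrict select_related() to the few relations
# actually needed, instead of following every ForeignKey:
schedules = (Schedule.objects
             .select_related('process', 'hardware')
             .order_by('id'))

# Option 2: use prefetch_related() (Django 1.4+) -- one extra
# query per relation, joined up in Python rather than in MySQL:
schedules = (Schedule.objects
             .prefetch_related('task__request__order')
             .order_by('id'))
```

[Either approach keeps the SQL sent to MySQL small enough for the optimizer to plan sensibly.]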

Cal Leeming [Simplicity Media Ltd]

Jul 30, 2012, 8:53:10 AM
to django...@googlegroups.com
Could you provide us with the models.py for this table? Just the model definitions will suffice.

Can you also provide the snippet of Python code which is generating this SQL?

Cal



--
You received this message because you are subscribed to the Google Groups "Django users" group.
To view this discussion on the web visit https://groups.google.com/d/msg/django-users/-/xwjqe16wlNYJ.
To post to this group, send email to django...@googlegroups.com.
To unsubscribe from this group, send email to django-users...@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/django-users?hl=en.

lubos

Jul 31, 2012, 6:42:45 PM
to django...@googlegroups.com
Hello, I forgot to mention that this query is generated automatically by the Django admin when opening this model's admin page.


On Monday, July 30, 2012 at 14:53:10 UTC+2, Cal Leeming [Simplicity Media Ltd] wrote:
Cal


Kurtis Mullins

Aug 1, 2012, 2:28:39 PM
to django...@googlegroups.com
Since you said this is caused by admin, did you modify the admin for this class to pull in all of these details? If not, you could probably tell it specifically which fields to show to reduce the number of joins it has to execute.

That said, it may be worthwhile to look at the way your data is structured and see whether you can simplify it. However, I have no idea of the domain of your project or what you've modeled.

I don't know how much experience you have with data modeling, but if you'd like to post your models (as Cal mentioned), we may be able to offer more suggestions on ways to improve performance, which would probably help in many more places than this one instance. Again, I'm not trying to look down on you, so please don't take it that way :) There are many people out there much smarter and more experienced in the realm of data modeling than myself!

Good luck!
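[Kurtis's suggestion translates to admin configuration roughly like this — a sketch under assumed model and field names; `list_select_related` is a boolean option on `ModelAdmin` in Django 1.4:]

```python
# admin.py -- hypothetical; field names inferred from the query above.
from django.contrib import admin
from data.models import Schedule

class ScheduleAdmin(admin.ModelAdmin):
    # Display only local columns, so the changelist needs no joins:
    list_display = ('id', 'start', 'times', 'done')
    # Stop the changelist from calling select_related() on everything:
    list_select_related = False
    # Show ForeignKeys as raw IDs on the edit page instead of
    # building huge <select> widgets from the related tables:
    raw_id_fields = ('process', 'hardware', 'task')

admin.site.register(Schedule, ScheduleAdmin)
```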