My User model contains 13 column attributes, 1 association proxy for a many-to-many relationship, and 6 one-to-many relationships set up with lazy='dynamic'. When I tracked the memory usage of a single instance, only 6 attributes were loaded (the others are deferred), and the memory used was 1.6 MB.
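For context, the model looks roughly like this (a simplified sketch, not my real code; the column and relationship names here are made up):

from sqlalchemy import Column, ForeignKey, Integer, String, Table
from sqlalchemy.orm import declarative_base, deferred, relationship
from sqlalchemy.ext.associationproxy import association_proxy

Base = declarative_base()

user_group = Table(
    "user_group", Base.metadata,
    Column("user_id", ForeignKey("users.id"), primary_key=True),
    Column("group_id", ForeignKey("groups.id"), primary_key=True),
)

class Group(Base):
    __tablename__ = "groups"
    id = Column(Integer, primary_key=True)
    name = Column(String)

class Post(Base):
    __tablename__ = "posts"
    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, ForeignKey("users.id"))
    body = Column(String)

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    name = Column(String)
    # ... 13 columns in total; most are wrapped in deferred(),
    # so they are not loaded until first accessed:
    bio = deferred(Column(String))

    # one of the 6 one-to-many relationships; lazy='dynamic' means
    # accessing user.posts returns a Query rather than a loaded list:
    posts = relationship("Post", lazy="dynamic")

    # the many-to-many relationship plus its association proxy:
    groups = relationship("Group", secondary=user_group)
    group_names = association_proxy("groups", "name")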
I did a rough calculation: if one object takes that much, then loading 100 should take around 160 MB.
Then I created 50 user objects, tracked memory again, and to my surprise the total memory consumed was only 2 MB!
What is the reason for this? Are these objects sharing some base that is around 1.5 MB, so that each actual object is only a few KB?
I also tracked an object of another type defined with 5 fields, which took only about 30 KB, while yet another object containing only 2 fields was 90 KB. Why do the sizes behave so differently?
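In case my method is the problem, this is roughly how I am measuring (an illustrative sketch using psutil, not my exact code; note it measures the RSS of the whole process, which may be the wrong thing to look at):

import os
import psutil
from sqlalchemy import create_engine
from sqlalchemy.orm import Session

proc = psutil.Process(os.getpid())

def rss_mb():
    # resident set size of the whole Python process, in MB
    return proc.memory_info().rss / (1024 * 1024)

engine = create_engine("sqlite://")   # in-memory DB just for the sketch
Base.metadata.create_all(engine)      # Base/User as in the model above

with Session(engine) as session:
    before = rss_mb()
    users = session.query(User).limit(50).all()
    print(f"{rss_mb() - before:.2f} MB for {len(users)} user objects")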
Reading in this group and elsewhere about memory usage and SQLAlchemy, I have seen it said that once a Python process accumulates memory, it only releases it back to the OS when the process exits, and that has left me a little confused.
Does that mean that when I have loaded 100 objects and am done using them, their memory will not be released back to the system?
Final question: my understanding is that expiring an object leaves only a weak reference to it in the Session, so it can be collected by the GC if nothing else is using it. Does expunging an object do the same?
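In other words, I am asking about the difference between these two calls as far as garbage collection goes (both are real Session methods; the comments are just my current understanding):

# user was loaded earlier via this session
session.expire(user)   # attributes are marked stale and will be
                       # reloaded on next access; the object stays in
                       # the Session's identity map
session.expunge(user)  # the object is removed from the Session
                       # entirely; after this, only my own references
                       # keep it alive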
Mainly: how can I free the memory taken by these objects once I am done using them?
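Is something like the following the right approach? (A sketch of what I have in mind, not something I know to work.)

import gc

users = session.query(User).all()
# ... work with users ...
session.close()   # drop the Session's own references to the objects
del users         # drop my reference as well
gc.collect()      # but does the memory actually go back to the OS?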
That's quite a lot of questions. Thank you for reading.
Regards,
Manav Goel