X = ... fetch a large object ...,
... some processing ...,
X1 = ... extract a part of X ...,
... long running job ...
X = ... fetch a large object ...,
... some processing ...,
X1 = ... extract a part of X ...,
function2(X1).

function2(X1) ->
    ... long running job ...
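To make the scoping point concrete, here is a minimal sketch of that second variant (fetch_large_object/0, extract_part/1 and long_running_job/1 are placeholder names, not functions from this thread). After the tail call into function2/1 nothing references X any more, so the GC can reclaim it while the long job is still running:

function1() ->
    X = fetch_large_object(),   %% e.g. a multi-megabyte term
    Part = extract_part(X),     %% keep only the small piece we need
    function2(Part).            %% tail call: X is no longer referenced

function2(Part) ->
    long_running_job(Part).     %% X can be collected while this runs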
X1 = ... fetch a large object ...,
... some processing ...,
X2 = ... extract a part of X1 ...,
erlang:garbage_collect(),
... long running job ...
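As a concrete sketch of this variant (same placeholder names as above): erlang:garbage_collect/0 forces a full collection of the calling process, so the memory held by X1 is reclaimed before the long job starts, at the cost of the extra sweep itself.

run() ->
    X1 = fetch_large_object(),
    X2 = extract_part(X1),
    %% X1 is not used below this point, so the forced collection
    %% can reclaim it now instead of waiting for the next normal GC.
    erlang:garbage_collect(),
    long_running_job(X2).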
However, my suggestion is this: instead of the garbage collector (GC) doing a full sweep to work out which data has gone out of scope before reclaiming it, could the program (or rather I, the calling process) deliberately declare that it is finished with a given piece of data, so that the GC may free that part?
The obvious question is whether you are sure you actually need to optimise to save memory. Premature optimisation and all that. (Actually sensible advice.) Maybe reviewing your algorithms and data structures will do the trick for you.
Robert
From my Nexus
I found that temporary variables, e.g. the result of binary_to_list on XML data of around 100 KB (xmerl needs a string), won't get freed for a long time without forced garbage collection. So when there are about 500 user sessions, each process holding on to large memory blocks, system memory usage becomes extremely high. We plan to support a much larger number of user sessions, in the tens of thousands, and this memory consumption is a show-stopper for us at the moment.
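For illustration, a minimal sketch of that pattern (handle_request/1 and extract_fields/1 are made-up names): the 100 KB binary is expanded into a list, which costs 16 bytes per character on a 64-bit system, and both the list and xmerl's temporary terms stay on the process heap until the next collection, even though only a small result is kept.

handle_request(XmlBin) when is_binary(XmlBin) ->
    Str = binary_to_list(XmlBin),            %% ~100 KB binary -> roughly 1.6 MB list
    {Doc, _Rest} = xmerl_scan:string(Str),   %% xmerl_scan:string/1 wants a string
    extract_fields(Doc).                     %% only a small term survives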
Good advice.
For the short term, I think the option of hibernating the processes should be mentioned as well - it ensures that dormant session processes don't take up more memory than necessary.
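For illustration, a rough sketch of a session process that hibernates when idle (the message shape and the one-minute timeout are arbitrary, and handle/2 is a placeholder). erlang:hibernate/3 discards the call stack and garbage-collects the process down to a minimally sized heap; it resumes by calling session_loop/1 again when the next message arrives:

session_loop(State) ->
    receive
        {request, From, Req} ->
            From ! {reply, handle(Req, State)},
            session_loop(State)
    after 60000 ->
        %% Idle: shrink the heap and sleep until the next message.
        erlang:hibernate(?MODULE, session_loop, [State])
    end.

In a gen_server you get the same effect by returning {noreply, State, hibernate} from a callback.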
/Erik