def check_memory(k):
    a = get_memory_usage()
    silly_function(k)
    return get_memory_usage() - a

def silly_function(k):
    for i in range(10^k):
        2 + 2
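A note for anyone trying this outside Sage: the `^` in `10^k` only means exponentiation because the Sage preparser rewrites it to `**`; in plain Python, `^` is bitwise XOR. A minimal illustration of the difference:

```python
# In plain Python (no Sage preparser), ^ is bitwise XOR, not
# exponentiation, so range(10^k) would loop over the wrong count.
print(10 ^ 3)   # XOR: 0b1010 ^ 0b0011 = 0b1001 = 9
print(10 ** 3)  # exponentiation: 1000
```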
--
You received this message because you are subscribed to the Google Groups "sage-support" group.
To unsubscribe from this group and stop receiving emails from it, send an email to sage-support+unsubscribe@googlegroups.com.
To post to this group, send email to sage-s...@googlegroups.com.
Visit this group at https://groups.google.com/group/sage-support.
For more options, visit https://groups.google.com/d/optout.
You can fill your memory with something simpler:
sage: l = range(10**9)
As far as I can see it has nothing to do with Sage or loops. In Python 2
the range function constructs a list, and in the above example that
list is huge.
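To illustrate the point: in Python 3 (where `range` is lazy, like Python 2's `xrange`), you can compare the footprint of the lazy object against a fully materialized list. This is a sketch using the standard `sys.getsizeof`; the exact byte counts vary by platform, but the orders of magnitude do not:

```python
import sys

n = 10**6
lazy = range(n)          # Python 3 range (like Python 2 xrange): lazy
eager = list(range(n))   # Python 2-style range: fully materialized list

print(sys.getsizeof(lazy))   # a few dozen bytes, independent of n
print(sys.getsizeof(eager))  # about 8 bytes per slot: megabytes
```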
def ex_f(q):
    K = GF(q)
    P.<x,y,z> = PolynomialRing(K, 3)
    monomials = [x, y, z, x^2, y^2, z^2]
    for v in K^6:
        # renamed the loop variable inside zip to c so it does not
        # shadow the vector v being iterated over
        f = 1 + sum([c*m for (c, m) in zip(v, monomials)])
        f.factor()
Thank you very much for your prompt replies. I was sure that range(n) creates an iterator instead of the list itself; my bad. In any case, even if the function creates this list, why is it still stored in memory after the function terminates? It is a local variable and is not returned, so it should be erased from RAM.
It need not happen, unless you explicitly trigger garbage collection:

import gc
gc.collect()

See https://docs.python.org/2/library/gc.html for details.
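To make the distinction concrete: reference counting frees most objects immediately, and the cyclic collector exists for objects that reference themselves. A small sketch of what `gc.collect()` actually reclaims (its return value is the number of unreachable objects found):

```python
import gc

gc.disable()             # pause automatic collection for a deterministic demo

# Build a reference cycle: a list that contains itself.  Reference
# counting alone can never free it, since its count never reaches zero.
cycle = []
cycle.append(cycle)
del cycle                # now unreachable, but still allocated

collected = gc.collect() # explicit collection pass reclaims the cycle
gc.enable()
print(collected)         # at least one object was found unreachable
```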
On Friday 15 December 2017 at 16:47:02 UTC+1, Nils Bruin wrote:
And undoubtedly it did (a list of integers has no circular references, so it can be deleted based on reference counts alone), but obtaining memory from and returning memory to the operating system is an expensive operation, so Python is probably reluctant to do so.
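A quick way to see that CPython frees acyclic objects the instant their reference count hits zero, with no collector pass involved: attach a weak reference and watch it clear on `del`. The `TrackedList` subclass below is only there because plain lists do not support weak references; it is an illustrative sketch, not Sage code:

```python
import weakref

class TrackedList(list):
    """List subclass used only so we can attach a weak reference
    (built-in lists do not support weakrefs)."""

big = TrackedList(range(1000))
ref = weakref.ref(big)

assert ref() is big   # alive while a strong reference exists
del big               # refcount drops to zero: freed immediately
assert ref() is None  # no gc.collect() was needed
```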
get_memory_usage only reports how much memory is allocated to the Python process, not how much of it Python actually considers in use.
So there is a discrepancy between the amount of memory actually in use and the amount allocated to the process, but when I run a computation it is the allocated amount that my hardware limits. So, for instance, I can evaluate ex_f(p) only for very small values of p, even though the amount of memory actually in use stays almost zero for any p.
If your routine ex_f runs out of memory, that would indicate a memory leak. If you look at the code for (K^6).__iter__ you'll see it is really intended to be an iterator that uses limited memory.
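For readers unfamiliar with how such an iterator avoids materializing all q^6 vectors: a rough plain-Python analogue of iterating over K^6 is `itertools.product`, which yields one tuple at a time. This is only a sketch of the idea (using `range(5)` as a stand-in for the elements of a small field), not the actual Sage implementation:

```python
import itertools

# Yield one 6-tuple at a time instead of building all q^6 of them,
# analogous in spirit to (K^6).__iter__ yielding vectors lazily.
field = range(5)                     # stand-in for the 5 elements of GF(5)
vectors = itertools.product(field, repeat=6)

count = sum(1 for _ in vectors)      # walks all 5^6 tuples lazily
print(count)                         # 15625, with O(1) extra memory
```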