Hi,

I am wondering about the following phenomenon: I have a list "interrels" of polynomials in many variables and a list "potential_sols" of potential solutions, and I use the following code to check which of them are actually solutions (roots):
sols = []
for psol in potential_sols:
    for interrel in interrels:
        if interrel(*psol) != 0:
            break
    else:
        print(psol)
        sols.append(psol)

I am not surprised (under my circumstances) that this takes a long time; in fact, it wasn't that bad at first. What surprises me is that it uses huge amounts of memory. The actual data don't take much memory, and the code just runs linearly through all possibilities, so it should need practically no extra memory at all (at most another copy of potential_sols). Instead, it had filled the 4 GB of physical memory plus 2 GB of swap by the time I canceled it (and by then it was also terribly slow, simply because of all the swapping). This seems to suggest a memory leak.
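For clarity, here is a minimal, Sage-free sketch of the same for/else filtering pattern, with plain Python callables standing in as hypothetical stand-ins for the polynomial relations (the names and sample data are invented for illustration):

```python
# Hypothetical stand-ins for the Sage polynomials: each relation is a
# callable that should evaluate to 0 at a genuine solution.
interrels = [
    lambda x, y: x + y - 3,   # encodes x + y = 3
    lambda x, y: x * y - 2,   # encodes x * y = 2
]
potential_sols = [(1, 2), (2, 1), (0, 3)]

sols = []
for psol in potential_sols:
    for interrel in interrels:
        if interrel(*psol) != 0:
            break                 # psol fails this relation; skip it
    else:                         # no break: psol satisfied every relation
        sols.append(psol)

print(sols)  # -> [(1, 2), (2, 1)]
```

The inner loop's else clause runs only when no break occurred, i.e. when every relation evaluated to 0, so memory-wise this pattern by itself allocates nothing beyond the sols list.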
Is there any way I can check which parts of Sage (and which data structures) use how much memory? Are there any known memory-leak issues related to polynomial rings over finite fields? Is there a garbage collection I need to trigger manually?
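As a starting point for the first and third questions, the Python standard library's tracemalloc and gc modules work inside a Sage session too (Sage runs on Python); a minimal sketch, with a synthetic workload standing in for the actual computation:

```python
# Generic Python-level memory diagnostics, usable from a Sage session.
# tracemalloc reports which source lines allocated the most memory;
# gc.collect() forces a manual garbage-collection pass.
import gc
import tracemalloc

tracemalloc.start()

# Synthetic stand-in for the real workload.
junk = [list(range(1000)) for _ in range(100)]

snapshot = tracemalloc.take_snapshot()
for stat in snapshot.statistics("lineno")[:3]:
    print(stat)  # top allocation sites, largest first

del junk
unreachable = gc.collect()  # returns the number of unreachable objects found
print("collected:", unreachable)
tracemalloc.stop()
```

This only shows Python-level allocations; memory held inside C libraries that Sage wraps would not appear in the tracemalloc statistics.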
Thanks for any help!

Best,
Stefan