I've run into the memory limit with github.com/mzimmerman/sdzpinochle
(a card game AI) due to my algorithm needing to hold so many
permutations of legal outcomes in the game. Running under the memory
profiler really helped. I find it easiest to do that using go test to
set up the profiling. http://blog.golang.org/profiling-go-programs
How big is a thingo.Foo{}?
The first thing I see here is that you're
re-creating the Foo each time. As I understand it, it would be easier
on the garbage collector if, instead of creating a new one, you gave
it a zero() method that resets it on each iteration. At that point, I
don't think your code would need to allocate any additional memory.
Just profile it; I could be completely wrong. In the profiling post:
" Every time FindLoops is called, it allocates some sizable bookkeeping structures. Since the benchmark calls FindLoops 50 times, these add up to a significant amount of garbage, so a significant amount of work for the garbage collector.
Having a garbage-collected language doesn't mean you can ignore memory allocation issues. In this case, a simple solution is to introduce a cache so that each call to FindLoops reuses the previous call's storage when possible."
Strings are immutable, so you can't reuse their storage. I'd
recommend storing your data as a []byte and zeroing its length before
loading data again.