I'm implementing a value encoder similar to gob. While benchmarking my code against gob, I noticed some unexpected memory allocations with a particular type of data.
Consider a simple function foo() that receives a reflect.Value wrapping a []uint as input. It calls the Interface() method to get back a []uint value that it can then use.
When the input value is a []uint, the Interface() method doesn't allocate.
When the input value is a []uint element of a [][]uint (a matrix), the Interface() method allocates the slice header on the heap.
I was wondering whether this is just a limitation of the optimizer, or whether there is a constraint that forces the allocation.
In my code and in gob's code (see the helper functions in enc_helpers.go), the slice returned by Interface() is only used locally.