Free tensor memory in Torch Lua after using it


shab

Oct 16, 2017, 10:33:29 PM
to torch7

Hello, I'm new to Torch and CUDA and am getting confused.
I define a sample big CudaTensor as follows:
a = torch.CudaTensor(1,224,1040,960):fill(124.15)
This uses about 1.1 GB of GPU memory, and the change is visible both in nvidia-smi and via the command cutorch.getMemoryUsage(1) (I have one GPU).
Now if I resize the storage and the tensor as follows:
a:storage():resize(1)
a:resize(0)
cutorch.getMemoryUsage(1) now shows that the memory has been freed, but nvidia-smi does not reflect the change. (Calling collectgarbage() makes no difference either.)
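For reference, here is the full sequence in one place, as a minimal sketch assuming cutorch is installed and device 1 is the single GPU:

```lua
require 'cutorch'

-- Allocate a large tensor on the GPU (~1.1 GB at 4 bytes per float element).
local a = torch.CudaTensor(1, 224, 1040, 960):fill(124.15)
print(cutorch.getMemoryUsage(1))  -- free bytes drop by roughly 1.1 GB

-- Shrink the underlying storage and the tensor itself.
a:storage():resize(1)
a:resize(0)
collectgarbage()

-- cutorch reports the memory as freed here, yet nvidia-smi still
-- shows it as allocated to the process.
print(cutorch.getMemoryUsage(1))
```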

Can anyone help me figure out what the problem is? And what should I do to free GPU memory in Torch Lua once a CudaTensor is no longer needed? Setting it to nil and calling the garbage collector is not working.
Thanks