Ah, it seems the normalization is with respect to a 0-255 pixel range instead of 0-1. You can load images in 'byte' format, avoiding the extra multiplication by 255 here.
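For example, something like this (just a sketch; the filename and the use of the `image` package are assumptions about your setup):

```lua
require 'image'

-- loads as a FloatTensor in [0, 1], so it has to be scaled up afterwards
local img = image.load('input.jpg', 3, 'float'):mul(255)

-- alternatively, load directly as a ByteTensor already in [0, 255],
-- skipping the extra multiplication
local img_byte = image.load('input.jpg', 3, 'byte')
```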
As for the second link, soumith probably simplified the normalization to mean = 117 and std = 1 just to keep things simple (that's why you don't see a division for the std part; a :div(1) would be kind of pointless).
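So the preprocessing there boils down to something like this (a sketch, assuming `img` is already in the 0-255 range):

```lua
-- mean = 117, std = 1: just subtract the mean, no division needed
img:add(-117)
```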
There aren't a whole lot of models in the Torch format, but you can use the ones
here; you just need to download the right prototxt files and convert images from RGB to BGR (img = image.load(filename):index(1, torch.LongTensor{3, 2, 1})). I've tested some and got good results.
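For reference, here's roughly how I'd wire it up with loadcaffe (the file names, the 224x224 input size and the mean value are placeholders/assumptions, check the prototxt of the model you pick):

```lua
require 'torch'
require 'image'
require 'loadcaffe'

-- load the Caffe definition + weights into an nn model
local model = loadcaffe.load('deploy.prototxt', 'model.caffemodel', 'nn'):float()
model:evaluate()

-- load the image in [0, 255] and resize to the network's input size
local img = image.load('input.jpg', 3, 'float'):mul(255)
img = image.scale(img, 224, 224)

-- swap RGB -> BGR (Caffe models expect BGR) and subtract the mean
img = img:index(1, torch.LongTensor{3, 2, 1})
img:add(-117)

-- add a batch dimension and run the forward pass
local output = model:forward(img:view(1, 3, 224, 224))
```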