I am not quite sure what exactly you want to do; I am guessing you want to classify all images from an HDF5 dataset. This won't work as simply as you imagine. There are two basic possibilities:
1. Change the net config to use the HDF5 database in question as input, then just load the network with caffe.Net (not sure whether you could also use caffe.Classifier, I have never used that myself) and call net.forward() as many times as there are batches in your HDF5 file, extracting the predictions from the final blobs after each call before calling forward() again. On each call of forward() Caffe will load a new batch of data from the HDF5 file and pass it through the network.
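For reference, a data layer for option 1 might look like the following. This is only a sketch; the file name "test_h5_list.txt", the blob names and the batch size are placeholders you would adapt to your own config:

```
layer {
  name: "data"
  type: "HDF5Data"
  top: "data"
  top: "label"
  hdf5_data_param {
    # a plain text file listing the paths of your .h5 files, one per line
    source: "test_h5_list.txt"
    batch_size: 50
  }
  include { phase: TEST }
}
```

Note that hdf5_data_param takes a text file listing the HDF5 files, not the .h5 file itself.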
2. Change the net config by removing all input layers and providing an "input" and "input_shape" declaration for each input blob, yielding a so-called "deploy" config. Then load that config with caffe.Net. Now, before you can call forward(), you need to fill the input blobs yourself:
# you may need to reshape the blobs first (in particular to change the batch size)
net.blobs['input'].reshape(N, C, H, W)
net.blobs['input'].data[...] = <my data, one batch>
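The head of such a deploy config could look like this (the net name and the dimensions are placeholders; N/C/H/W are batch size, channels, height and width):

```
name: "MyNet"
input: "data"
input_shape {
  dim: 10   # N, batch size
  dim: 3    # C, channels
  dim: 227  # H, height
  dim: 227  # W, width
}
# ... followed by the remaining layers of the original config,
# with the data/input layers removed
```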
In this scenario you will need to read the data from the HDF5 file yourself, but with h5py that is really simple:
with h5py.File('my.h5', 'r') as f:
firstbatch = f['data'][0:batchsize,:,:,:]
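Putting the pieces of option 2 together, the batching loop could look like the sketch below. The toy dataset (10 samples of shape 3x8x8 written to 'my.h5') only stands in for your real file, and the commented-out net calls assume a deploy net loaded as described above, with an input blob named 'data':

```python
import numpy as np
import h5py

# Toy stand-in for a real dataset; in practice you would skip this
# and just open your existing .h5 file.
with h5py.File('my.h5', 'w') as f:
    f.create_dataset('data',
                     data=np.random.rand(10, 3, 8, 8).astype(np.float32))

batchsize = 4
batch_shapes = []
with h5py.File('my.h5', 'r') as f:
    data = f['data']
    for start in range(0, data.shape[0], batchsize):
        batch = data[start:start + batchsize]  # the last batch may be smaller
        batch_shapes.append(batch.shape)
        # With a loaded deploy net you would now do something like:
        # net.blobs['data'].reshape(*batch.shape)
        # net.blobs['data'].data[...] = batch
        # out = net.forward()

print(batch_shapes)
```

Reshaping the blob per batch (rather than once) handles the smaller final batch cleanly.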
Jan