Caveat: I don't know if any of this is best practices. The code referenced below is a quick hack that seems to work for my purposes but that's it. You've been warned.
If you mean you want to run the net in your own code, where the input is provided not from a database but as a blob object, the process is roughly as follows.
1. Create a caffe::Net instance, providing either the prototxt file path or a NetParameter (take your pick) to the constructor.
2. Call Net::CopyTrainedLayersFrom(), passing in the path to the .caffemodel you created when training.
3. Call Net::Forward(), passing in the input blob(s); it will return the output blobs from the net.
4. Use the output blobs.
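The four steps above look roughly like this in C++. This is just a sketch, not my actual code: the file names "deploy.prototxt" and "model.caffemodel" are placeholders for your own files, and the input-filling loop stands in for whatever preprocessing your net needs.

```cpp
#include <caffe/caffe.hpp>
#include <vector>

int main() {
  caffe::Caffe::set_mode(caffe::Caffe::CPU);

  // 1. Build the net from the deploy-time prototxt, in TEST phase.
  caffe::Net<float> net("deploy.prototxt", caffe::TEST);

  // 2. Load the weights saved during training.
  net.CopyTrainedLayersFrom("model.caffemodel");

  // Fill the input blob; its shape comes from the input
  // declaration at the top of the deploy prototxt.
  caffe::Blob<float>* input = net.input_blobs()[0];
  float* data = input->mutable_cpu_data();
  for (int i = 0; i < input->count(); ++i) {
    data[i] = 0.0f;  // your preprocessed pixel values go here
  }

  // 3. Run the forward pass. The unconnected top blobs of the net
  //    come back in this vector.
  const std::vector<caffe::Blob<float>*>& outputs = net.Forward();

  // 4. Use the output blob(s), e.g. read the raw scores/probabilities.
  const float* result = outputs[0]->cpu_data();
  (void)result;  // ... argmax over result for a classifier, etc. ...
  return 0;
}
```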
But there's a bit more to it. The net's prototxt (or NetParameter, if you're not providing a prototxt file) needs to be a bit different from the one you used for training. Remove the data layers, since we're not reading the inputs from a database but providing them as blobs; instead, declare the name of the input at the top of the prototxt, along with the blob's shape (see the linked example). Also remove the loss layers, leaving the output of the net unconnected, so that it will appear in the vector of blobs returned by Net::Forward(). And since the data layers are gone, any transformations they performed, such as scaling RGB values from 0-255 down to 0-1, now have to be done in your own code. You'll see in the example code that I used caffe::DataTransformer for that.
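A deploy-time prototxt ends up looking something like the fragment below. The layer and blob names here ("data", "fc8", "prob") are made up for illustration; use whatever names your training prototxt has, and adjust the input shape to match your net.

```
name: "deploy_net"
# Input declared in place of the training data layer.
# Shape: batch of 1, 3 channels, 224x224.
input: "data"
input_shape { dim: 1 dim: 3 dim: 224 dim: 224 }

# ... the same middle layers as in your training prototxt ...

# Final layer left unconnected, with no loss layer after it,
# so its top blob is returned by Net::Forward().
layer {
  name: "prob"
  type: "Softmax"
  bottom: "fc8"
  top: "prob"
}
```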
Here's a link to example code, and partial net prototxt snippet:
https://gist.github.com/jyegerlehner/d269d8b12bc3b0273b4a

The code is more complicated than what you probably need, because the output blob in this case is another image, not category labels, and most of it deals with the blob that comes out, converting it from blob -> Datum -> CvMat -> jpg file. But if you look for the Net::Forward call and see what leads up to it, it should be pretty obvious.
I hope that's more helpful than confusing.