I'll be completely honest: for my research and projects I'm now using Keras (http://keras.io/) instead of Caffe. It's been a lot easier for me to create networks dynamically in code instead of needing configuration files (although Keras can save and load network topologies and parameters to configuration files too).
I spent quite some time getting Caffe to work for me, but since I tried Keras I actually haven't used Caffe again. Caffe is great, but for my projects Keras fit better.
These are the most important things about Keras for me:
* A trained network can be saved to files (as in Caffe): the topology and the trained parameters.
* Networks are created in very simple Python code (or loaded from files), so they can be built dynamically. That makes hyperparameter optimization (for example with Hyperopt: http://hyperopt.github.io/hyperopt/) a lot easier: I can create new networks with different parameters and sizes directly in code, instead of from configuration files that I would somehow have to generate from code (there's a sketch of these first two points right after this list).
* I don't have to compile, read and write Protocol Buffers or their files, or dig through the Caffe source code to figure things out because the docs aren't always completely up to date.
* I can easily create RNNs (Recurrent Neural Networks), for example for time series or text (there's a small example after the list too).
* It can run on top of Theano or TensorFlow, taking advantage of an Nvidia CUDA graphics card (as with Caffe). ...and you could ride the TensorFlow hype and brag that you're using it hehe ;).
* Another Python option, not based on Theano, that also performs very well is Nervana, but it requires a recent graphics card architecture (Kepler onwards) and I needed to be able to run on an older Fermi architecture (and I'm not sure how Nervana does on the other points).
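To make those first two points more concrete, here's a minimal sketch of how that looks for me. The data, layer sizes and file names are just placeholders I made up for the example, and I'm using `nb_epoch`, which newer Keras versions rename to `epochs`. The topology is a plain Python function, Hyperopt picks the hidden layer size, and the topology and parameters get saved to files:

```python
import numpy as np
from keras.models import Sequential, model_from_json
from keras.layers import Dense
from hyperopt import fmin, tpe, hp, STATUS_OK

# Placeholder data just for the sketch: 1000 samples, 20 features, binary labels.
X = np.random.rand(1000, 20)
y = np.random.randint(2, size=(1000, 1))

def build_model(hidden_units):
    # The whole topology is plain Python, so it can depend on hyperparameters.
    model = Sequential()
    model.add(Dense(hidden_units, input_dim=20, activation='relu'))
    model.add(Dense(1, activation='sigmoid'))
    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
    return model

def objective(params):
    model = build_model(params['hidden_units'])
    history = model.fit(X, y, nb_epoch=5, batch_size=32,
                        validation_split=0.2, verbose=0)
    # Hyperopt minimizes, so hand it the final validation loss.
    return {'loss': history.history['val_loss'][-1], 'status': STATUS_OK}

space = {'hidden_units': hp.choice('hidden_units', [32, 64, 128])}
best = fmin(objective, space, algo=tpe.suggest, max_evals=10)
print(best)

# Save a trained network, like Caffe's .prototxt / .caffemodel split:
# topology as JSON, trained parameters as HDF5.
model = build_model(64)
model.fit(X, y, nb_epoch=5, batch_size=32, verbose=0)
with open('topology.json', 'w') as f:
    f.write(model.to_json())
model.save_weights('weights.h5', overwrite=True)

# ...and load it back later:
with open('topology.json') as f:
    restored = model_from_json(f.read())
restored.load_weights('weights.h5')
```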
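And for the RNN point, a tiny sketch of an LSTM for sequence data (again with made-up shapes and random data, just to show how little code it takes):

```python
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

# Made-up sequence data: 500 sequences, 10 timesteps, 8 features per timestep.
X_seq = np.random.rand(500, 10, 8)
y_seq = np.random.rand(500, 1)

model = Sequential()
model.add(LSTM(32, input_shape=(10, 8)))  # recurrent layer over the 10 timesteps
model.add(Dense(1))                       # e.g. predict the next value of a series
model.compile(loss='mse', optimizer='rmsprop')
model.fit(X_seq, y_seq, nb_epoch=3, batch_size=32, verbose=0)
```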
Here's how I see it (if I were you):
* You will probably be better off starting with Ubuntu 16.04. New software and new versions will be available for it for the next several years. I would put my bets on Ubuntu 16.04.
* Have you used Docker, or are you interested in Docker? If "yes" to either, go with NVIDIA-Docker (and learn Docker if you haven't; it's completely worthwhile). I love Docker and I'm using it everywhere; it solves a whole lot of problems for me (if I could install Docker on my Android phone I would), and it will be useful for you in a lot of other cases in the future. And there are already Docker images with Caffe.
* If you just want to deploy a Caffe model from the model zoo (for example, a big already-trained network for image recognition), go with Caffe; you would just have to make it work and you are done.
* If you aren't much of a Linux "expert" (taken from the private email, not from this thread), I think installing Keras and making it work would be a lot easier than Caffe, with all its C++ requirements, protocol buffers, libraries, paths, variables, etc. The installation instructions in the Keras docs were quite straightforward for me, and I had a working installation in one day, compared to the week or more it took to install Caffe (which ended up in that long guide I made).
* If you want to build, customize and experiment with neural networks from scratch, and would prefer Python over C++, I would go with Keras.
* If you want to try RNNs, Keras.
* If you want hyperparameter optimization with Hyperopt, Keras.
* If you want to fiddle with the network, save intermediate results (for example to plot learning curves), etc.: Keras (there's a small sketch of this right below). Caffe can train from Python and you could theoretically get those intermediate results that way, but training from Python is a lot slower than training with the C++ binaries, so you end up losing the C++ "advantage" (at least that's what happened the last time I tried). And that C++ advantage is not that noticeable compared to Theano, for example, if you look at the benchmarks.
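About that last point, this is roughly what I mean by getting intermediate results (placeholder random data again, assuming you have matplotlib around, and `nb_epoch` is `epochs` in newer Keras versions): `fit()` gives you the per-epoch losses, so plotting a learning curve is a couple of lines:

```python
import numpy as np
import matplotlib.pyplot as plt
from keras.models import Sequential
from keras.layers import Dense

# Placeholder data just for the example.
X = np.random.rand(1000, 20)
y = np.random.randint(2, size=(1000, 1))

model = Sequential()
model.add(Dense(64, input_dim=20, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam')

# fit() returns a History object with the per-epoch metrics.
history = model.fit(X, y, nb_epoch=10, batch_size=32,
                    validation_split=0.2, verbose=0)

plt.plot(history.history['loss'], label='train')
plt.plot(history.history['val_loss'], label='validation')
plt.xlabel('epoch')
plt.ylabel('loss')
plt.legend()
plt.savefig('learning_curve.png')
```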
But again, if you *just* want to deploy a pre-trained model "as a service" to classify images or something like that, the models from the model zoo are great and they already did the hard work for you (the sketch below is roughly all the code you need).
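For reference, running a zoo model from Python looks roughly like this. It's a sketch, not something you can run as-is: the file names are placeholders, and the "data"/"prob" blob names and the 227x227 input size depend on the specific model you download from the zoo.

```python
import numpy as np
import caffe

caffe.set_mode_cpu()  # or caffe.set_mode_gpu()

# The .prototxt / .caffemodel pair comes with the model zoo download
# (file names here are placeholders).
net = caffe.Net('deploy.prototxt', 'model.caffemodel', caffe.TEST)

# Fake image batch of shape (batch, channels, height, width); real code would
# load and preprocess an image (resizing, mean subtraction, channel order).
image = np.random.rand(1, 3, 227, 227).astype(np.float32)
net.blobs['data'].reshape(*image.shape)
net.blobs['data'].data[...] = image

output = net.forward()
probs = output['prob'][0]
print(probs.argmax())  # index of the most likely class
```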
If you end up using Caffe and get it working on Ubuntu 16.04 by making some changes to that guide I made, I would greatly appreciate a GitHub pull request with those changes (or even more comments here), so that others can take advantage of your new knowledge.
And to finish, a quick personal tip for development that you might find useful. I'm not very good at memorizing all the code, functions, modules, methods, parameters, etc., so I use lots of autocompletion. And I like to be able to run parts of my code to see what to do next and to check intermediate values. I really like how IPython notebooks have autocompletion and let me run parts of the code (and see intermediate values), but an IPython notebook isn't really a "code editor". So right now I'm in love with Atom (https://atom.io) with the Hydrogen plug-in (https://atom.io/packages/hydrogen). If that description of my needs fits you too, you may want to try it out, especially while you are exploring and coding with new Python modules and APIs. And Hydrogen (in Atom) now supports running Python from inside a Docker container (I added that), and it has helped me a lot. That might help you explore the APIs of the modules you end up using.
I'm sorry for not being able to help more, but I hope some of those ideas help you find your solution.
Best regards.