This is wrong. You can, and should, use pretrained weights for everything except the last layer, because only the last layer is modified; the rest of the network stays the same.
As to everything being zero:
The FCN readme explicitly tells you why that happens: "This is almost universally due to not initializing the weights as needed." If you really wanted to train from scratch, you would have to initialize the weights some other way. I checked the FCN prototxts: they do not include any weight_fillers. This means you created a layer and did not fill it with anything (no pretrained weights, no random initialization), so inference on any image through such a network will produce zeros everywhere. Consult the Caffe reference AlexNet model to see how weight_filler and bias_filler work.
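For illustration, here is roughly what the fillers look like in the reference AlexNet/CaffeNet prototxt (layer name and hyperparameters taken from that model; written in the newer "layer" syntax):

```protobuf
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param {
    num_output: 96
    kernel_size: 11
    stride: 4
    # random Gaussian initialization for the weights
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    # biases start at a constant value
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
```

Without the two filler blocks, Caffe leaves the blobs at zero unless weights are loaded from a pretrained model.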
What you want to do is rename the classifier layer (the one whose num_output equals the number of classes) so that its pretrained weights are not loaded. But as mentioned above, you must still initialize it by adding fillers. Otherwise you have successfully loaded all the weights for feature extraction but left the classifier at zero, so no matter what your convolutional layers detect, the classifier outputs zeros all the way.
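A minimal sketch of such a renamed classifier layer (the name "score_mydata" and the class count are hypothetical; Caffe skips loading weights for any layer whose name has no match in the .caffemodel, so renaming forces fresh initialization):

```protobuf
layer {
  name: "score_mydata"   # renamed: no match in the pretrained .caffemodel
  type: "Convolution"
  bottom: "fc7"
  top: "score"
  convolution_param {
    num_output: 21       # set this to your number of classes
    kernel_size: 1
    # fillers are still required, or this layer stays all-zero
    weight_filler { type: "gaussian" std: 0.01 }
    bias_filler { type: "constant" value: 0 }
  }
}
```

With this setup, all earlier layers keep their pretrained weights (their names still match) while the renamed classifier starts from a random initialization.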