There is no such layer to my knowledge. However, you can build it yourself: the tutorial on
loss layers mentions that Caffe will treat any layer (capable of backpropagating) as a loss if you assign it the
loss_weight parameter.
I think you could build an L1 loss from an Eltwise layer (
this answer shows how to use it to subtract two blobs) followed by an AbsVal layer and then an InnerProduct layer. Initialize the InnerProduct with a constant-type weight filler of 1, and set its
lr_mult and
decay_mult to zero so its weights don't change during training; have it output a single number, which then becomes your loss. The way I see it, Eltwise + AbsVal computes |y - h(x)| element-wise, and the InnerProduct then sums over all elements, giving sum_i |y_i - h(x_i)|. I never tried this myself though, so good luck ;)
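A rough prototxt sketch of what I mean (untested; the bottom blob names "pred" and "label" are placeholders for your network's prediction and ground truth):

```
# Subtract the two blobs: diff = pred - label
layer {
  name: "diff"
  type: "Eltwise"
  bottom: "pred"    # placeholder: your network's output
  bottom: "label"   # placeholder: the ground truth
  top: "diff"
  eltwise_param {
    operation: SUM
    coeff: 1
    coeff: -1
  }
}
# Element-wise absolute value: |pred - label|
layer {
  name: "abs"
  type: "AbsVal"
  bottom: "diff"
  top: "abs"
}
# Frozen all-ones InnerProduct sums the absolute differences;
# loss_weight makes Caffe treat the output as a loss.
layer {
  name: "l1_loss"
  type: "InnerProduct"
  bottom: "abs"
  top: "l1_loss"
  loss_weight: 1
  param { lr_mult: 0 decay_mult: 0 }  # freeze the summing weights
  inner_product_param {
    num_output: 1
    bias_term: false
    weight_filler { type: "constant" value: 1 }
  }
}
```

Note that InnerProduct sums over everything except the batch axis, so its top is an N×1 blob; if I remember Caffe's loss accumulation correctly, it then sums those N values (times loss_weight) into the final scalar, so this gives you the total rather than the mean L1 over the batch. You could scale loss_weight by 1/batch_size if you want an average instead.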