Creating 1 network from 2 different network models


Aleksander Kużel

Feb 24, 2017, 1:39:19 PM
to Caffe Users
Hi all, 

I have 2 different network models which I'm trying to connect so that they work together. 

Let's say that we have a picture of a dog. 
For example, NM1 predicts that it's a cat with probability 0.52 and a dog with probability 0.48. 
NM2 predicts that it's a dog with probability 0.6 and a cat with probability 0.4.   

NM1 - will predict wrongly 
NM2 - will predict correctly

NM1 + NM2 - the combination will predict correctly (because 0.48 + 0.6 > 0.52 + 0.4)

That's the best way I can explain what result I'm looking for. 


So, I have 2 different network models, and at the end I connect the last layers of NM1 and NM2 with a Concat layer.

layer {
  name: "concat"
  bottom: "fc8"
  bottom: "fc8N"
  top: "out"
  type: "Concat"
  concat_param {
    axis: 1
  }
}


And it's even working. The network is training, though very slowly. 

Thus, my question is: does it work like it's supposed to? 

Or maybe I just don't understand what the Concat layer does and it's connecting the layers in some other way? 

Thank you for your answers!

Przemek D

Mar 2, 2017, 4:22:53 AM
to Caffe Users
Let's follow your example. NM1 outputs [0.52 0.48] and NM2 outputs [0.4 0.6]. If you do Concat, you simply get [0.52 0.48 0.4 0.6] - probably not what you're looking for. If you want to add the vectors and get [0.92 1.08], you want to use the Eltwise layer.
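Something like this should do the element-wise sum (untested, and assuming fc8 and fc8N output blobs of the same shape):

layer {
  name: "sum"
  type: "Eltwise"
  bottom: "fc8"
  bottom: "fc8N"
  top: "out"
  eltwise_param {
    operation: SUM
  }
}

Eltwise requires all its bottoms to have the same shape, which should already be the case if both networks predict over the same set of classes.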
Note that this makes sense assuming fc8 contains probabilities. In reality, that layer (before passing through SoftmaxWithLoss) will contain raw network output, for example [352.53 335.08]. Since fc8N might output values on a vastly different scale (for example [34.61 70.22]), adding them directly could be pointless - in that case you might want to pass each through a Softmax first, then Eltwise, and finally on to the actual loss function. I'm just guessing though, because I've never tried that.
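As a rough sketch of that last idea (again untested - the layer names and the 0.5 averaging coefficients are just my guess), the end of the merged net could look like:

layer {
  name: "prob"
  type: "Softmax"
  bottom: "fc8"
  top: "prob"
}
layer {
  name: "probN"
  type: "Softmax"
  bottom: "fc8N"
  top: "probN"
}
layer {
  name: "sum"
  type: "Eltwise"
  bottom: "prob"
  bottom: "probN"
  top: "out"
  eltwise_param {
    operation: SUM
    coeff: 0.5
    coeff: 0.5
  }
}

With the 0.5 coefficients, "out" is an average of two probability vectors, so a loss that expects probabilities (e.g. MultinomialLogisticLoss) would fit better there than SoftmaxWithLoss, which would apply another softmax on top of the averaged probabilities.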