I have a multi-task network with two similar branches, and a pre-trained network with only one branch (identical in architecture to each of the two branches).
I want to initialize the weights of the layers in the two branches of my multi-task network with the weights of the corresponding layers in my pre-trained network.
Now, I can initialize one of the branches correctly by giving its layers the same names as in the pre-trained network, so the weights are picked up automatically.
But I have to keep the layer names in the other branch different, and thus those layers won't pick up the pre-trained weights.
Also, I don't want the two branches to share weights, so giving the weights of corresponding layers in the two branches the same param name won't work either.
Is there a nice way (or hack) to do this?
PS: I would like to avoid net surgery, but comments explaining a clean way to do it are also welcome.
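For concreteness, here is a sketch of the initialization I'm after, with plain NumPy arrays standing in for the framework's weight blobs and hypothetical layer names (`conv1`, `fc1`, and their renamed `_b` counterparts in the second branch):

```python
import numpy as np

# Stand-in for the pre-trained single-branch network:
# a dict mapping layer name -> weight array.
pretrained = {
    "conv1": np.random.randn(16, 3, 3, 3),
    "fc1":   np.random.randn(10, 144),
}

# Stand-in for the multi-task network. Branch A reuses the pre-trained
# names (and would be filled automatically by name matching); branch B
# uses different names, so it starts uninitialized.
multitask = {
    "conv1":   np.zeros((16, 3, 3, 3)),  # branch A
    "fc1":     np.zeros((10, 144)),
    "conv1_b": np.zeros((16, 3, 3, 3)),  # branch B (renamed layers)
    "fc1_b":   np.zeros((10, 144)),
}

# Explicit mapping from pre-trained layer names to the renamed
# branch-B layers (hypothetical names, for illustration).
name_map = {"conv1": "conv1_b", "fc1": "fc1_b"}

# Branch A: what automatic name matching would do.
for name in pretrained:
    multitask[name] = pretrained[name].copy()

# Branch B: copy the same weights in place under the new names.
# Each branch gets an independent copy, so no weights are shared.
for src, dst in name_map.items():
    np.copyto(multitask[dst], pretrained[src])
```

In pycaffe the analogous in-place copy after loading the net with the pre-trained weights would be `net.params[dst][i].data[...] = net.params[src][i].data` for each blob `i`, followed by `net.save(...)` — but that is essentially the net surgery I'd prefer to avoid, hence the question.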