to Keras-users
Is there a way to load weights for specific layers of a model? For example, if I trained a network with a particular architecture, how would I load those weights into a new network whose architecture is a subset of the first? Let's say my first network has layers A->B->C->D and my second has A->B->C. If I just use the load_weights() function, there is an error because it can't find layer D in my network, but for my application I would like D to simply be ignored.
Second, a related question: is there a difference between how weights are stored for a Graph vs. a Sequential model? For my particular case I want to use a subset of the weights from a Graph model in a Sequential model. I could always define my Sequential model as a Graph instead, but I'm guessing there are some speedups to using Sequential.
Mayank Sharma
unread,
Mar 17, 2016, 6:14:01 PM3/17/16
to Keras-users
One way of doing it is to load your complete model with the pretrained weights, use model.pop() (which pops the last layer), and then recompile the model.
For example:
model1 = ...                      # full A->B->C->D architecture
model1.load_weights(weight_file)  # load the pretrained weights
model1.pop()                      # pop the last layer (D)
model1.compile(...)               # recompile the model
J Rao
unread,
Mar 17, 2016, 9:54:00 PM3/17/16
to keras...@googlegroups.com
Hi,
Take a look at the save_weights and load_weights methods in models.py;
they're fairly straightforward. And yes, Graph and Sequential have
slightly different ways of saving weights, but I think in your case
Sequential is easier to work with: each layer is identified by
its index in the model, so if you know the index of the layer to be ignored,
you can write an alternative version of load_weights that skips it very easily.
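To illustrate the skip-by-index idea without needing Keras installed, here is a minimal sketch. It assumes (as the HDF5 layout discussed in this thread does) that saved weights are keyed 'param_0', 'param_1', ...; the dict `saved` and the helper `select_params` are hypothetical stand-ins for the h5py group and the custom loader:

```python
# Dummy stand-in for the h5py group holding the saved weights.
# Values here are placeholder lists, not real weight arrays.
saved = {
    'param_0': [0.1, 0.2],  # layer A weights
    'param_1': [0.3],       # layer B weights
    'param_2': [0.4, 0.5],  # layer C weights
    'param_3': [0.9],       # layer D weights -- the one to skip
}

def select_params(group, keep_indices):
    """Return the saved weight arrays for the given layer indices, in order."""
    return [group['param_{}'.format(i)] for i in keep_indices]

# Keep only layers A, B, C (indices 0-2); layer D (index 3) is ignored.
weights = select_params(saved, [0, 1, 2])
```

The resulting list could then be handed to model.set_weights(); the only real work is knowing which indices to keep.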
to Keras-users
Thanks for your suggestions! I ended up making my own version of load_weights, as the layers that I want to remove aren't always the last ones. It's a little annoying figuring out which layers to skip over since they are stored only by index number and not by layer name, but the code ended up not looking too bad in the end, and it worked. For anyone who comes across this problem again in the future, this is my code:
import h5py

def load_custom_weights(model, filepath, layer_indices):
    f = h5py.File(filepath, mode='r')
    g = f['graph']
    weights = [g['param_{}'.format(p)] for p in layer_indices]
    model.set_weights(weights)
    f.close()
This is specifically for loading from a Graph model into a Sequential model, but with minor tweaks it can be adapted to other cases. You pass the layers you want to keep to layer_indices as a list, e.g. layer_indices=[0, 1, 2, 6, 7].