confused on layer.get_weights()


周迎威

Jan 22, 2017, 6:27:08 AM1/22/17
to Keras-users
Hi, I just monitored the weight update for a simple mlp model:

model = mlp()  # one input layer (Dense), one hidden layer, and one output layer
weight_origin = model.layers[0].get_weights()[0]
model.fit(.....)  # with the Adam optimizer
weight_updated = model.layers[0].get_weights()[0]
print(weight_origin - weight_updated)



For the first dense layer, I got a matrix of zeros, so I assumed training wasn't changing these weights. However, the weights in the other layers did change. I'm confused: why would the first layer stay unchanged?
I checked the source code but still found no answer, so I tried monitoring:
model.layers[0].get_weights()[1]  # get_weights() returns a list of weight arrays
This time, the weights did change. So which weight is the "true" weight that is actually used during training? And why are there two elements in the weight list?
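To make concrete what I think the two elements are: for a Dense layer, get_weights() seems to return the kernel matrix at index 0 and the bias vector at index 1. Here is a minimal NumPy sketch of that layout (DenseSketch is a made-up stand-in, not Keras itself; only the [kernel, bias] ordering is the assumption being illustrated):

```python
import numpy as np

class DenseSketch:
    """Stand-in for a Keras Dense layer's weight storage (illustrative only)."""

    def __init__(self, n_in, units, seed=0):
        rng = np.random.default_rng(seed)
        # Element [0] of get_weights(): the kernel, shape (n_in, units)
        self.kernel = rng.normal(size=(n_in, units))
        # Element [1] of get_weights(): the bias, shape (units,)
        self.bias = np.zeros(units)

    def get_weights(self):
        # Returned as plain NumPy copies, kernel first, bias second
        return [self.kernel.copy(), self.bias.copy()]

layer = DenseSketch(n_in=4, units=3)
w = layer.get_weights()
print(len(w))      # 2 -> kernel and bias
print(w[0].shape)  # (4, 3)
print(w[1].shape)  # (3,)
```

So in my snippet above, index [0] is the kernel and index [1] is the bias, which would explain why the list has two entries.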
