'Embedding' object has no attribute 'get_shape'


lordt...@gmail.com

May 31, 2017, 8:07:04 AM5/31/17
to Keras-users
I was experimenting with the `Embedding` layer, which I need for ignoring some values in sequences (masking).

The code is the following:

import numpy as np
from keras.layers import Input, Dense, LSTM
from keras.layers.embeddings import Embedding
from keras.layers.merge import Concatenate
from keras.models import Model
from keras.utils import plot_model

trainExs = np.asarray([[1, 2, 3], [2, 3, 1]])
trainLabels = np.asarray([[1, 1, 1], [2, 2, 2]])

print('Examples, shape:', trainExs.shape)
print('Labels, shape:', trainLabels.shape)

W = [[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]]
symDim = 3

# E M B E D D I N G S
symbol_emb = Embedding(symDim + 1, symDim,
                       weights=np.asarray(W), trainable=False, mask_zero=True)
symbol_dense = Dense(symDim, use_bias=True, name='symbol_dense')(symbol_emb)

output_layer = Dense(symDim, dtype='float32', name='output')(symbol_dense)

# M O D E L
model = Model(inputs=[symbol_emb], outputs=[output_layer])
model.compile(loss='mean_squared_error', optimizer='RMSprop', metrics=['accuracy'])
# print(model.summary())



However, when I run it, I get the following error messages:

Examples, shape: (2, 3)
Labels, shape: (2, 3)
Using TensorFlow backend.
Traceback (most recent call last):
  File "D:/workspace/TESTS/test/testEMb.py", line 26, in <module>
    symbol_dense = Dense(symDim, use_bias=True, name='symbol_dense')(symbol_emb)
  File "D:\python\lib\site-packages\keras\engine\topology.py", line 541, in __call__
    self.assert_input_compatibility(inputs)
  File "D:\python\lib\site-packages\keras\engine\topology.py", line 450, in assert_input_compatibility
    ndim = K.ndim(x)
  File "D:\python\lib\site-packages\keras\backend\tensorflow_backend.py", line 479, in ndim
    dims = x.get_shape()._dims
AttributeError: 'Embedding' object has no attribute 'get_shape'

From the documentation I am not able to understand how to solve this.
I need the embedding to go from symbol indexes (in trainExs) to one-hot encodings, which are afterwards concatenated with another layer's output.
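To illustrate what I mean: row `i` of my `W` matrix is the one-hot code of symbol `i`, so the embedding should just be a table lookup by index. A standalone check in plain NumPy (no Keras involved):

```python
import numpy as np

# The fixed (non-trainable) embedding table: row i is the one-hot code of symbol i.
W = np.asarray([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])

# Looking up rows by index is exactly what I expect the Embedding layer to do.
onehots = W[np.asarray([1, 2, 3])]
print(onehots)  # -> [[1 0 0] [0 1 0] [0 0 1]]
```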

Is there anyone who can help me?
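For what it's worth, here is a minimal sketch of the pattern I understand the functional API expects (an `Input` tensor called through the layers, with the fixed table loaded via `set_weights` after building) — I am not sure this is the intended use, and the exact import paths may differ between Keras versions:

```python
import numpy as np
from keras.layers import Input, Dense, Embedding
from keras.models import Model

symDim = 3
W = np.asarray([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype='float32')

# The functional API wires tensors, not layer objects: start from an Input
# tensor and call each layer on the output of the previous one.
inp = Input(shape=(3,), dtype='int32')
emb_layer = Embedding(symDim + 1, symDim, trainable=False, mask_zero=True)
x = emb_layer(inp)
x = Dense(symDim, use_bias=True, name='symbol_dense')(x)
out = Dense(symDim, name='output')(x)

model = Model(inputs=[inp], outputs=[out])
model.compile(loss='mean_squared_error', optimizer='RMSprop')

# Load the fixed one-hot table once the layer's weights have been created.
emb_layer.set_weights([W])
```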
