from keras.layers import Input, Dense

batch_size = 16
original_dim = 784
latent_dim = 2
intermediate_dim = 128
epsilon_std = 0.01
nb_epoch = 40

x = Input(batch_shape=(batch_size, original_dim))
h = Dense(intermediate_dim, activation='relu')(x)
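For context, the latent sampling step in the blog example draws z with the reparameterization trick, z = mu + sigma * eps with eps ~ N(0, epsilon_std^2). A minimal NumPy sketch of just that step (the zero-filled `z_mean` and `z_log_sigma` arrays are stand-ins for the two `Dense(latent_dim)` encoder heads):

```python
import numpy as np

batch_size, latent_dim = 16, 2
epsilon_std = 0.01

# Stand-ins for the encoder outputs (dense_2 and dense_3 in the summary).
z_mean = np.zeros((batch_size, latent_dim))
z_log_sigma = np.zeros((batch_size, latent_dim))

def sampling(z_mean, z_log_sigma):
    # Reparameterization trick: z = mu + sigma * eps,
    # with eps drawn from N(0, epsilon_std^2).
    eps = np.random.normal(0.0, epsilon_std, size=z_mean.shape)
    return z_mean + np.exp(z_log_sigma) * eps

z = sampling(z_mean, z_log_sigma)
print(z.shape)  # (16, 2)
```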
And this is what my model eventually looks like:
>>> vae.summary()
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to
====================================================================================================
input_1 (InputLayer)             (16, 784)             0
____________________________________________________________________________________________________
dense_1 (Dense)                  (16, 128)             100480      input_1[0][0]
____________________________________________________________________________________________________
dense_2 (Dense)                  (16, 2)               258         dense_1[0][0]
____________________________________________________________________________________________________
dense_3 (Dense)                  (16, 2)               258         dense_1[0][0]
____________________________________________________________________________________________________
lambda_1 (Lambda)                ((16, 2), 2)          0           dense_2[0][0]
                                                                   dense_3[0][0]
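The parameter counts in that summary are consistent with plain Dense layers (an input-by-output weight matrix plus one bias per output unit); a quick sanity check:

```python
def dense_params(n_in, n_out):
    # A Dense layer holds an (n_in x n_out) weight matrix plus n_out biases.
    return n_in * n_out + n_out

print(dense_params(784, 128))  # 100480 -> dense_1
print(dense_params(128, 2))    # 258    -> dense_2 and dense_3
```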
And this is how I train it:
vae.fit(x_train, x_train,
        shuffle=True,
        nb_epoch=nb_epoch,
        batch_size=batch_size,
        validation_data=(x_test, x_test))
I don't think I'm missing anything, since the model code is basically a direct copy of the Keras blog example. But thank you for your reply anyway.
Michael
On Thursday, June 2, 2016 at 3:01:53 AM UTC-5, Constantin Ei wrote:
Hi,
it seems to me that you simply have to define "batch_size". Your error message says it is required but missing.
-- Constantin