--
You received this message because you are subscribed to the Google Groups "Keras-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to keras-users...@googlegroups.com.
To view this discussion on the web, visit https://groups.google.com/d/msgid/keras-users/8776dba5-225e-43a9-ad8b-4c0519e231d9%40googlegroups.com.
>>> import numpy as np
>>> from keras.models import Sequential
>>> from keras.layers import Convolution2D
>>> m = Sequential()
>>> m.add(Convolution2D(8,3,3, input_shape=(1,10,10)))
>>> m.compile(loss="mae", optimizer="sgd")
>>> c = m.predict(np.random.rand(1,1,10,10))
>>> c.shape
(1, 8, 8, 8)
>>> c = m.predict(np.random.rand(1,1,20,20))
>>> c.shape
(1, 8, 18, 18)
>>> m.fit(np.random.rand(100,1,10,10), np.random.rand(100,8,8,8))
<keras.callbacks.History object at 0x7f349919fb10>
>>> m.fit(np.random.rand(100,1,12,12), np.random.rand(100,8,10,10))
<keras.callbacks.History object at 0x7f3489d07a90>
import numpy as np
from keras.models import Sequential
from keras.layers import Convolution2D, Permute, Reshape, SimpleRNN, TimeDistributedDense
from keras.optimizers import RMSprop

model = Sequential()
model.add(Convolution2D(8, 3, 3, input_shape=np.shape(X_train[np.newaxis, 0, :, :])))
model.add(Permute((3, 2, 1)))
model.add(Reshape((-1, np.prod(model.layers[-1].output_shape[-2:]))))
# <------- Masking ? ------->
model.add(SimpleRNN(output_dim))
model.add(TimeDistributedDense(nb_classes, activation='softmax'))
optimizer = RMSprop(lr=learning_rate)
model.compile(loss='categorical_crossentropy', optimizer=optimizer)
In my training set, each image has a different shape. Since the output of the Convolution2D will have a different shape depending on the input image, after the reshape I will have sequences of different lengths. Is it possible to deal with variable-length sequences generated inside the neural network?
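One way to handle those variable-length sequences is to pad them all to a common length and let a Masking layer skip the padded timesteps before the SimpleRNN. A minimal sketch of the padding step in plain NumPy (the helper name `pad_sequences_2d` and the mask value 0.0 are illustrative assumptions, not a Keras API):

```python
import numpy as np

def pad_sequences_2d(seqs, mask_value=0.0):
    """Pad a list of (timesteps_i, features) arrays to (n, max_timesteps, features)."""
    max_len = max(s.shape[0] for s in seqs)
    n_feat = seqs[0].shape[1]
    out = np.full((len(seqs), max_len, n_feat), mask_value, dtype=np.float32)
    for i, s in enumerate(seqs):
        out[i, :s.shape[0], :] = s
    return out

# Two sequences of different lengths, 4 features each.
a = np.ones((3, 4), dtype=np.float32)
b = np.ones((5, 4), dtype=np.float32)
batch = pad_sequences_2d([a, b])
# batch.shape == (2, 5, 4); a Masking(mask_value=0.0) layer placed before the
# SimpleRNN would then ignore the two all-zero timesteps appended to `a`.
```

Note this only works if the mask value never occurs as a real timestep in your data, which is why the choice of mask value matters.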
The simplest thing is to use fit_generator and feed same-sized batches. The alternative is to implement some sort of masking and padding, but how to do that correctly depends on what exactly you are doing inside the network.
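For the fit_generator route, the idea is to bucket the images by shape so that every batch is a single well-formed array. A sketch of such a generator in plain NumPy (the function and variable names are illustrative assumptions, not a Keras API):

```python
import numpy as np
from collections import defaultdict

def same_size_batches(images, labels, batch_size):
    """Yield (x, y) batches in which all images share one shape.
    Loops forever, since fit_generator expects an endless generator."""
    buckets = defaultdict(list)
    for img, lab in zip(images, labels):
        buckets[img.shape].append((img, lab))
    while True:
        for shape, pairs in buckets.items():
            for i in range(0, len(pairs), batch_size):
                chunk = pairs[i:i + batch_size]
                x = np.stack([p[0] for p in chunk])
                y = np.stack([p[1] for p in chunk])
                yield x, y

# Three 10x10 images and two 12x12 images, channel-first as in the thread.
images = [np.zeros((1, 10, 10))] * 3 + [np.zeros((1, 12, 12))] * 2
labels = [np.zeros(2)] * 5
gen = same_size_batches(images, labels, batch_size=2)
x, y = next(gen)  # first batch: two (1, 10, 10) images stacked to (2, 1, 10, 10)
```

You would then train with something like `model.fit_generator(gen, samples_per_epoch=len(images), nb_epoch=10)`; shuffling within each bucket per epoch is a natural refinement.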
On 21 July 2017 at 12:52, asphalt <asfi...@gmail.com> wrote:
Hey, I have a training set of images, each of which has a different size. I do not want to lose data by resizing the images. How can one feed the data to the model.fit() function, since it accepts only arrays, and a single array consisting of multiple arrays (of different dimensions) is not supported by numpy? Thanks for all the help!!
On Thursday, February 4, 2016 at 10:33:48 PM UTC+5:30, deco...@gmail.com wrote: I would like to process the output of the Convolution2D with an RNN (a similar approach to the Image Caption Generation with attention paper, http://arxiv.org/pdf/1502.03044v2.pdf ), so I think I could use a Masking layer to deal with the different output sizes.
However, I can't deal with different input shapes to the network. What would be the best way to use Convolution2D on images of different sizes? Up to now I could only think of padding the numpy arrays of the input images with zeros to make them the same size. However, this can be very inefficient if there is considerable variation in the size of the images (as can be the case).
Can someone think of a way to work with different-sized images other than padding them? I'm not an expert in Theano, but I think it can't be done. It would be great if it were possible, since it would probably avoid unnecessary memory usage and computing time.
If someone thinks it's possible and can give me some hints, I would really appreciate it. (Also if someone is certain that it can't be done with current Theano and/or TensorFlow.)
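If you do go the zero-padding route mentioned above, the padding itself is straightforward in NumPy; the memory cost comes from padding everything to the global maximum rather than, say, a per-batch maximum. A minimal sketch for channel-first images (the helper name `pad_images` and the pad value 0.0 are illustrative assumptions):

```python
import numpy as np

def pad_images(images, pad_value=0.0):
    """Zero-pad a list of (channels, h, w) images to the largest h and w in the list."""
    max_h = max(img.shape[1] for img in images)
    max_w = max(img.shape[2] for img in images)
    out = []
    for img in images:
        c, h, w = img.shape
        padded = np.full((c, max_h, max_w), pad_value, dtype=img.dtype)
        padded[:, :h, :w] = img  # original image sits in the top-left corner
        out.append(padded)
    return np.stack(out)

batch = pad_images([np.ones((1, 10, 10)), np.ones((1, 20, 15))])
# batch.shape == (2, 1, 20, 15); the first image is padded with zeros.
```

Padding per batch (after grouping images of similar sizes together) rather than over the whole dataset keeps the waste much smaller when sizes vary a lot.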