Fine-tuning VGG-19. Error with Flatten() and input shape

brian.matth...@gmail.com

Sep 9, 2016, 2:12:46 AM
to Keras-users
I'm fine-tuning VGG-19 and only changing the fully connected classifier for now.

I created the base model without the top layer:
base_model = keras.applications.VGG19(include_top=False, weights='imagenet')

I am trying to recreate the following, but with a very simple binary classifier for now.

# Classification block
x = Flatten(name='flatten')(x)
x = Dense(4096, activation='relu', name='fc1')(x)
x = Dense(4096, activation='relu', name='fc2')(x)
x = Dense(1000, activation='softmax', name='predictions')(x)


Here's where the issue comes in:

x = base_model.output
x = Flatten()(x)  # fails here: the base model's output shape is not fully defined
x = Dense(256, activation='relu', name='fc1')(x)
x = Dropout(0.5)(x)
predictions = Dense(1, activation='sigmoid', name='predicts')(x)
model = Model(input=base_model.input, output=predictions)

When include_top=False, the output_shape[1:] of the base model is (512, None, None) instead of a (512, 7, 7) tensor that can be flattened. The only thing I can find that converts a 4D tensor with None dimensions to a 2D tensor is global pooling, but that doesn't seem like the correct step, considering a MaxPool layer already precedes the fully connected section in the original model. Does anyone know the correct approach to customizing the fully connected classifier with the Functional API when the top of the VGG19 model is not included?
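For reference, the global-pooling variant I mean would look roughly like this (just a sketch of that workaround using GlobalAveragePooling2D, not the Flatten/Dense structure I actually want to reproduce):

from keras.applications import VGG19
from keras.layers import GlobalAveragePooling2D, Dense, Dropout
from keras.models import Model

base_model = VGG19(include_top=False, weights='imagenet')
x = base_model.output
# Global pooling collapses the (None, None, 512) feature map to a fixed
# 512-vector, so it builds even when the spatial dimensions are unknown.
x = GlobalAveragePooling2D()(x)
x = Dense(256, activation='relu', name='fc1')(x)
x = Dropout(0.5)(x)
predictions = Dense(1, activation='sigmoid', name='predicts')(x)
model = Model(input=base_model.input, output=predictions)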

Thanks,
Brian

justin...@gmail.com

Nov 7, 2016, 9:16:28 AM
to Keras-users
Hi, I've run into the same problem. Have you figured it out?

On Friday, September 9, 2016 at 2:12:46 PM UTC+8, Brian Holligan wrote:

Erukala Uttam

Mar 24, 2017, 6:50:38 AM
to Keras-users
Try providing the input_shape parameter to keras.applications.VGG19: input_shape=(224, 224, 3) for the TensorFlow image ordering, or input_shape=(3, 224, 224) for the Theano ordering. With the shape fixed, the base model's output has defined spatial dimensions and Flatten() works.
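Roughly how that looks end to end, as a sketch assuming the TensorFlow backend/image ordering and the Keras 1.x-style Model(input=..., output=...) arguments used above, with the convolutional base frozen so only the new classifier trains:

from keras.applications import VGG19
from keras.layers import Flatten, Dense, Dropout
from keras.models import Model

# Fixing input_shape pins the spatial dimensions, so the last conv block
# outputs a (7, 7, 512) feature map instead of (None, None, 512).
base_model = VGG19(include_top=False, weights='imagenet', input_shape=(224, 224, 3))

# Freeze the convolutional base; only the new classifier head is trained.
for layer in base_model.layers:
    layer.trainable = False

x = base_model.output
x = Flatten()(x)  # 7 * 7 * 512 = 25088 features, now a defined size
x = Dense(256, activation='relu', name='fc1')(x)
x = Dropout(0.5)(x)
predictions = Dense(1, activation='sigmoid', name='predicts')(x)

model = Model(input=base_model.input, output=predictions)
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])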