tensorflow tf.gradient returns None (deep dream tutorial on cifar 10)

jrafae...@gmail.com

Aug 29, 2017, 2:42:27 AM
to Discuss
Please reply on Stack Overflow so that a larger community benefits from the answer:

I'm trying to adapt TF's Deep Dream notebook to a CIFAR-10 model trained using TF's cifar10 tutorial. I've adapted cifar10_multi_gpu_train.py to work with my own data and to use two placeholders for the image and label inputs. The model seems to be training properly, as the loss decreases and the classification accuracy increases.

UPDATE: I've also tried using a frozen graph obtained with the single-GPU cifar10 tutorial in TF.

I load the saved meta-graph and the input tensor this way:

# load the saved model
saver = tf.train.import_meta_graph('/tmp/cifar10_train/model.ckpt-3000.meta')
graph = tf.get_default_graph()
sess = tf.InteractiveSession(graph=graph)
saver.restore(sess, tf.train.latest_checkpoint('/tmp/cifar10_train/'))
graph_def = graph.as_graph_def()

I get the tensors for the input image and conv2 weights with this:

t_input = graph.get_tensor_by_name('inputs:0')
layer = graph.get_tensor_by_name("conv2/weights:0")

I print the gradient of the conv2 weights w.r.t. the image input tensor to see whether this gradient depends on the image input:

channel = 32 # picking some feature channel to visualize
t_obj = layer[:,:,:,channel]
t_score = tf.reduce_mean(t_obj)
t_grad = tf.gradients(t_score, t_input)
print(t_grad)

Unfortunately, the output of print(t_grad) is [None].

Any advice is very welcome!
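For reference, tf.gradients returns [None] when there is no path in the graph from the score tensor back to the source tensor, and that is exactly the situation here: a score built from a weights variable does not depend on the input placeholder. A minimal self-contained sketch of both cases (TF 1.x graph mode via tf.compat.v1; shapes and names are made up for illustration):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

x = tf.compat.v1.placeholder(tf.float32, shape=[None, 24, 24, 3], name='inputs')
w = tf.compat.v1.get_variable('conv2_weights', shape=[5, 5, 3, 64])

# Score built only from the weights variable: no path to x, so tf.gradients
# returns [None].
g_none = tf.gradients(tf.reduce_mean(w), x)
print(g_none)  # [None]

# Score built from an activation that actually uses x: a real gradient tensor.
y = tf.nn.conv2d(x, w, strides=[1, 1, 1, 1], padding='SAME')
g_ok = tf.gradients(tf.reduce_mean(y), x)
print(g_ok[0])
```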

jrafae...@gmail.com

Aug 29, 2017, 11:06:17 PM
to Discuss, jrafae...@gmail.com
Solved. I was using the wrong tensors.
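For anyone hitting the same [None]: the "wrong tensor" here is conv2/weights, which is a variable and therefore has no dependence on the input placeholder; the Deep Dream objective has to be built from a layer *activation* instead. A minimal sketch under that assumption (the graph below is a toy stand-in for the restored cifar10 graph, and all names are illustrative — in a real restored graph you would fetch the conv2 op's output via graph.get_tensor_by_name, whose exact name depends on how the graph was built):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# Toy stand-in for the restored cifar10 graph; names are illustrative only.
t_input = tf.compat.v1.placeholder(tf.float32, [None, 24, 24, 3], name='inputs')
weights = tf.compat.v1.get_variable('conv2_weights', [5, 5, 3, 64])
# The conv2 *activation* (depends on t_input), not the weights variable.
layer = tf.nn.relu(tf.nn.conv2d(t_input, weights,
                                strides=[1, 1, 1, 1], padding='SAME'))

channel = 32                              # some feature channel to visualize
t_obj = layer[:, :, :, channel]
t_score = tf.reduce_mean(t_obj)
t_grad = tf.gradients(t_score, t_input)[0]
print(t_grad)  # a real tensor now, not None
```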