Wipe out dropout operations from TensorFlow graph


kometa....@gmail.com

unread,
Nov 1, 2016, 7:33:05 AM11/1/16
to Discuss
Hi,


I have a trained, frozen graph that I am trying to run on an ARM device. Basically, I am using contrib/pi_examples/label_image, but with my own network instead of Inception. My network was trained with dropout, which is now causing me trouble:


Invalid argument: No OpKernel was registered to support Op 'Switch' with these attrs.  Registered kernels:
  device='CPU'; T in [DT_FLOAT]
  device='CPU'; T in [DT_INT32]
  device='GPU'; T in [DT_STRING]
  device='GPU'; T in [DT_BOOL]
  device='GPU'; T in [DT_INT32]
  device='GPU'; T in [DT_FLOAT]

[[Node: l_fc1_dropout/cond/Switch = Switch[T=DT_BOOL](is_training_pl, is_training_pl)]]


One solution I can see is to build a TF static library that includes the corresponding op. On the other hand, it might be a better idea to eliminate the dropout ops from the network entirely, to make it simpler and faster. Is there a way to do that?

Thanks.

Asher Newcomer

unread,
Nov 1, 2016, 10:50:16 AM11/1/16
to kometa....@gmail.com, Discuss
There may be a more elegant way to accomplish this, but setting the keep_probability to 1 in the dropout layer (disabling the dropout) seems to do the trick.
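For reference, here is a small numpy sketch (not the real TF kernel, just the same "inverted dropout" idea) of why keep_prob=1 turns the dropout layer into an identity op:

```python
import numpy as np

def inverted_dropout(x, keep_prob, rng):
    # TF-style inverted dropout: drop units with probability 1 - keep_prob
    # and scale the survivors by 1 / keep_prob.
    mask = rng.random(x.shape) < keep_prob
    return x * mask / keep_prob

rng = np.random.default_rng(0)
x = np.arange(8, dtype=float)

# With keep_prob = 1 every unit survives and the scale factor is 1,
# so the layer is exactly the identity.
assert np.array_equal(inverted_dropout(x, 1.0, rng), x)
```

This disables the randomness, but note it does not remove the ops from the graph, which is what matters for the missing-kernel error above.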

--
You received this message because you are subscribed to the Google Groups "Discuss" group.
To unsubscribe from this group and stop receiving emails from it, send an email to discuss+unsubscribe@tensorflow.org.
To post to this group, send email to dis...@tensorflow.org.
To view this discussion on the web visit https://groups.google.com/a/tensorflow.org/d/msgid/discuss/b44aefd3-b1fd-4d62-933a-c17102a21ebb%40tensorflow.org.

kometa....@gmail.com

unread,
Nov 1, 2016, 11:09:03 AM11/1/16
to Discuss, kometa....@gmail.com
No, I mean completely removing the operations related to dropout, so they are no longer present in the graph.
Otherwise, the ARM version of TF does not allow me to run it.

Indeed, I already found how to do that, similar to graph_util.remove_training_nodes.
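For anyone searching later, the core of that approach is just graph surgery: delete the dropout-related nodes and rewire their consumers to the dropout's input. A toy sketch of the idea, with plain dicts standing in for a GraphDef (in a real frozen graph you would walk graph_def.node the same way; the names here are made up):

```python
def remove_nodes(graph, doomed):
    # graph: {node_name: [input_names]}; doomed: set of node names to splice out.
    def resolve(name):
        # Follow chains of doomed nodes back to a surviving upstream node.
        while name in doomed:
            name = graph[name][0]
        return name
    return {n: [resolve(i) for i in ins]
            for n, ins in graph.items() if n not in doomed}

g = {"input": [], "dropout": ["input"], "matmul": ["dropout"]}
print(remove_nodes(g, {"dropout"}))
# -> {'input': [], 'matmul': ['input']}
```

The real remove_training_nodes does essentially this for Identity/CheckNumerics-style nodes, plus bookkeeping for control dependencies.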


Pete Warden

unread,
Nov 1, 2016, 11:24:09 AM11/1/16
to kometa....@gmail.com, Discuss
Are you using a high-level framework that lets you remove the dropout when you build the model? For example, tf.slim allows you to specify is_training=False, if I remember correctly.

Otherwise you can look at remove_training_nodes(), but you'll need to be careful to adjust any weights that are affected by subsequent dropouts. If you do get that working, I'd be interested in a PR!


Pete Warden

unread,
Nov 1, 2016, 12:50:37 PM11/1/16
to kometa....@gmail.com, Discuss
> you'll need to be careful to adjust any weights that are affected by subsequent dropouts.

Apologies, I got this part wrong. The way we implement dropout in TensorFlow is scaling-free, so you should be able to safely remove it from the graph without affecting other nodes. 
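In other words, TF uses "inverted" dropout: surviving activations are already scaled up by 1/keep_prob at training time, so the expected value of the layer's output equals its input and no weight rescaling is needed when the op is removed. A numpy sketch of that property:

```python
import numpy as np

def inverted_dropout(x, keep_prob, rng):
    # Drop with probability 1 - keep_prob, scale survivors by 1 / keep_prob.
    mask = rng.random(x.shape) < keep_prob
    return x * mask / keep_prob

rng = np.random.default_rng(42)
x = np.full(100_000, 3.0)

# Averaged over many units, the dropped-out output matches the input,
# so deleting the op at inference changes nothing downstream.
y = inverted_dropout(x, 0.5, rng)
assert abs(y.mean() - 3.0) < 0.05
```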

Alex Rothberg

unread,
Feb 28, 2017, 1:17:13 AM2/28/17
to Discuss
Did you ever figure out a solution for removing dropout from the graph so that the model could be loaded on an ARM device?

andre...@gmail.com

unread,
Mar 31, 2017, 4:52:42 AM3/31/17
to Discuss
I also have this issue with dropout... it is a bit unnerving, as the ability to remove it really should ship with the Graph Transform Tool...

@Alex Rothberg: how do you define node_name_from_input(i) in your proposed solution?

arot...@4combinator.com

unread,
Mar 31, 2017, 9:45:39 AM3/31/17
to Discuss, andre...@gmail.com
`node_name_from_input` and `node_from_map` can be found in `tensorflow.python.tools.optimize_for_inference_lib`
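For anyone without that module handy, node_name_from_input is a small helper that normalizes a NodeDef input string; roughly this (a paraphrase, not the exact library code — node_from_map is then just a dict lookup by that name):

```python
def node_name_from_input(node_name):
    # Strip a leading "^" (control-dependency marker) and a trailing
    # ":<output port>" so "^foo/bar:1" resolves to the node name "foo/bar".
    if node_name.startswith("^"):
        node_name = node_name[1:]
    if ":" in node_name:
        node_name = node_name.rsplit(":", 1)[0]
    return node_name

assert node_name_from_input("^l_fc1_dropout/cond/Switch:1") == "l_fc1_dropout/cond/Switch"
assert node_name_from_input("is_training_pl") == "is_training_pl"
```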