I am running the Deep Occlusion Framework developed by pierrebaque. In the RunUnaries file I am getting the following error.
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-4-4a99bb3ab8e4> in <module>()
1 import UnariesNet
----> 2 uNet = UnariesNet.unariesNet()
/media/kirmani/New Volume/Office/DeepOcclusion-master/DeepOcclusion-master/UnariesNet.py in __init__(self, load_pretrained)
117 self.train_func = theano.function(inputs=[X,t_rois,Ybb,In(p_drop, value=0.5)],
118 outputs=[T.exp(log_p_out),loss], updates=updates_loss_VGG,
--> 119 allow_input_downcast=True,on_unused_input='warn')
120
121 self.test_func = theano.function(inputs=[X,t_rois,Ybb,In(p_drop, value=0.0)],
/home/kirmani/.local/lib/python2.7/site-packages/theano/compile/function.pyc in function(inputs, outputs, mode, updates, givens, no_default_updates, accept_inplace, name, rebuild_strict, allow_input_downcast, profile, on_unused_input)
318 on_unused_input=on_unused_input,
319 profile=profile,
--> 320 output_keys=output_keys)
321 # We need to add the flag check_aliased inputs if we have any mutable or
322 # borrowed used defined inputs
/home/kirmani/.local/lib/python2.7/site-packages/theano/compile/pfunc.pyc in pfunc(params, outputs, mode, updates, givens, no_default_updates, accept_inplace, name, rebuild_strict, allow_input_downcast, profile, on_unused_input, output_keys)
440 rebuild_strict=rebuild_strict,
441 copy_inputs_over=True,
--> 442 no_default_updates=no_default_updates)
443 # extracting the arguments
444 input_variables, cloned_extended_outputs, other_stuff = output_vars
/home/kirmani/.local/lib/python2.7/site-packages/theano/compile/pfunc.pyc in rebuild_collect_shared(outputs, inputs, replace, updates, rebuild_strict, copy_inputs_over, no_default_updates)
205 ' function to remove broadcastable dimensions.')
206
--> 207 raise TypeError(err_msg, err_sug)
208 assert update_val.type == store_into.type
209
TypeError: ('An update must have the same type as the original shared variable (shared_var=<TensorType(float32, matrix)>, shared_var.type=TensorType(float32, matrix), update_val=Elemwise{add,no_inplace}.0, update_val.type=TensorType(float64, matrix)).', 'If the difference is related to the broadcast pattern, you can call the tensor.unbroadcast(var, axis_to_unbroadcast[, ...]) function to remove broadcastable dimensions.')
I am new to Theano, and any help with this issue would be highly appreciated. Thanks.
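For context on what the error means: the shared variable is stored as float32, but the update expression Theano built for it came out as float64, and Theano refuses to silently downcast an update. This typically happens when a float32 tensor is combined with a plain Python float or a default-float64 array (e.g. a learning rate), which promotes the whole expression to float64. The common fix is to set `floatX = float32` in Theano's config (e.g. `THEANO_FLAGS=floatX=float32`) or to cast the offending constants to float32. A minimal NumPy sketch of the same promotion behavior (variable names here are illustrative, not from the framework):

```python
import numpy as np

# A float32 array, analogous to the float32 shared variable in the traceback.
w = np.zeros((2, 2), dtype=np.float32)

# np.ones() defaults to float64, so the gradient-step expression is
# promoted to float64 -- the same mismatch Theano complains about.
lr = 0.01
update = w + lr * np.ones((2, 2))
print(update.dtype)  # float64

# Keeping every operand in float32 keeps the update in float32,
# matching the shared variable's type.
lr32 = np.float32(0.01)
update32 = w + lr32 * np.ones((2, 2), dtype=np.float32)
print(update32.dtype)  # float32
```

In the Theano case, the equivalent is making sure the learning rate and any constants used in `updates_loss_VGG` are float32 (or relying on `floatX=float32` so symbolic constants default to float32).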