R-op DownsampleFactorMax


m_z...@sbox.tugraz.at

unread,
Dec 12, 2011, 11:09:55 AM12/12/11
to theano-dev
Hi,

Before starting with some questions, I want to thank everybody for
creating this great framework. I'm really impressed by the great work
you have done here, and I think it pushes the machine learning
community a BIG step in the right direction.
Theano combines simplicity with speed; I really like that!

I know that every one of you has a lot of things to do, so don't
bother if you have no time.
Anyway, while testing some Hessian-free code I found that some
functions are not implemented yet (DownsampleFactorMax, needed for
convolutional neural networks).

-----------------------------------------------------------------------------------------------------------------------------------------
Exception: R_op was not implemented for DownsampleFactorMax
operation. Email the mailing list for help
-----------------------------------------------------------------------------------------------------------------------------------------

Some functions, like 'softmax', have already been discussed in the
users group and there is a quick fix
-----------------------------------------------------------------------------------------------------------------------------------------
import theano.tensor as T

def symbolic_softmax(x):
    e = T.exp(x)
    return e / T.sum(e, axis=1).dimshuffle(0, 'x')
-----------------------------------------------------------------------------------------------------------------------------------------
for that, so the problem is solved there (I think there is already a
fix for that one in the recent code).
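For example, a minimal check (a hedged sketch with made-up variable
names, assuming the usual imports) that T.Rop goes through the
workaround, since it is built only from ops whose R_op is defined:
-----------------------------------------------------------------------------------------------------------------------------------------
import numpy
import theano
import theano.tensor as T

x = T.matrix('x')
v = T.matrix('v')                      # direction for the R-operator
y = symbolic_softmax(x)                # defined above: only exp / sum / division inside
jv = T.Rop(y, x, v)                    # no "R_op was not implemented" error here
f = theano.function([x, v], jv)

x_val = numpy.random.randn(3, 4).astype(theano.config.floatX)
v_val = numpy.random.randn(3, 4).astype(theano.config.floatX)
print(f(x_val, v_val).shape)           # (3, 4)
-----------------------------------------------------------------------------------------------------------------------------------------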

Regarding the DownsampleFactorMax R-op, I have a question: is somebody
working on that at the moment? Otherwise I will try to get into it
myself. Any help would be greatly appreciated.

Additional question: I think the R-op code will have the same logic
as DownsampleFactorMaxGrad(), with additional code for whatever the
R-op does here ... (?)

thx, Mat

Frédéric Bastien

unread,
Dec 12, 2011, 5:00:10 PM12/12/11
to thean...@googlegroups.com
On Mon, Dec 12, 2011 at 5:09 PM, m_z...@sbox.tugraz.at
<m_z...@sbox.tugraz.at> wrote:
> Hi,
>
> Before starting with some questions, I want to thank everybody for
> creating this great framework. I'm really impressed by the great work
> you have done here, and I think it pushes the machine learning
> community a BIG step in the right direction.
> Theano combines simplicity with speed; I really like that!

Thanks for the comment!

>
> I know that every one of you has a lot of things to do, so don't
> bother if you have no time.
> Anyway, while testing some Hessian-free code I found that some
> functions are not implemented yet (DownsampleFactorMax, needed for
> convolutional neural networks).
> -----------------------------------------------------------------------------------------------------------------------------------------
> Exception:  R_op was not implemented for DownsampleFactorMax
> operation. Email the mailing list for help
> -----------------------------------------------------------------------------------------------------------------------------------------


Most of the lab is at the NIPS conference now, so this won't be done
quickly, I think. If you want to try to implement it, you can check
these pages for some documentation:

http://deeplearning.net/software/theano/cifarSC2011/extending_theano.html
http://deeplearning.net/software/theano/extending/op.html#R_op
http://deeplearning.net/software/theano/tutorial/gradients.html#jacobian-times-a-vector


Any pull request for that will be appreciated :)
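
If it helps, the toy-op example in the extending docs shows roughly
where the R_op method goes on an Op; a minimal sketch (the DoubleOp
below is purely illustrative, not the pooling op, and details may
differ by Theano version):
-----------------------------------------------------------------------------------------------------------------------------------------
import theano
import theano.tensor as T

class DoubleOp(theano.Op):
    """Toy op computing y = 2 * x, just to show where R_op goes."""

    def make_node(self, x):
        x = T.as_tensor_variable(x)
        return theano.Apply(self, [x], [x.type()])

    def perform(self, node, inputs, output_storage):
        x, = inputs
        output_storage[0][0] = 2 * x

    def grad(self, inputs, output_grads):
        gz, = output_grads
        return [2 * gz]

    def R_op(self, inputs, eval_points):
        # The Jacobian of y = 2x is 2*I, so J*v is simply 2*v.
        ev, = eval_points
        if ev is None:
            return [None]
        return [2 * ev]

x = T.vector('x')
v = T.vector('v')
y = DoubleOp()(x)
jv = T.Rop(y, x, v)    # dispatches to DoubleOp.R_op
-----------------------------------------------------------------------------------------------------------------------------------------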

> Regarding the DownsampleFactorMax R-op, I have a question: is somebody
> working on that at the moment? Otherwise I will try to get into it
> myself. Any help would be greatly appreciated.

Ask your questions; it is easier to answer questions during breaks or
at the end of the day than to write code :)

> Additional question: I think the R-op code will have the same logic
> as DownsampleFactorMaxGrad(), with additional code for whatever the
> R-op does here ... (?)

There is a link between the gradient and the R-op for sure, as you
know :) In fact, the pages I linked talk about a function that
verifies your implementation of R_op automatically: it computes the
Jacobian times a vector and compares it with the result of the R_op
implementation. You can have a look at those test functions and
implement a slow version of this check as a first step, if that is
helpful to you. Otherwise, you will need to find a way to express it
symbolically or make a new op to compute it. If you go in the new-op
direction, looking at the DownsampleFactorMaxGrad op is a good
starting point, I think. First, write the Python version and test it
with the test function. Only after that should you look at changing
the c_code to do the requested work, if you want to make it faster.
But you don't need to do that.
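
A rough sketch of such a slow check (hedged: the tanh expression is
just a stand-in for the pooled output, and all names here are made
up) is to compare T.Rop against a Jacobian built row by row with scan
and then dotted with the vector:
-----------------------------------------------------------------------------------------------------------------------------------------
import numpy
import theano
import theano.tensor as T

x = T.vector('x')
v = T.vector('v')                       # direction for the R-operator
y = T.tanh(T.sum(x ** 2) * x)           # stand-in differentiable expression

# Result of the R_op machinery: Jacobian of y w.r.t. x, times v.
rop = T.Rop(y, x, v)

# Slow reference: build the full Jacobian with scan, then dot it with v.
J, updates = theano.scan(lambda i, y_, x_: T.grad(y_[i], x_),
                         sequences=T.arange(y.shape[0]),
                         non_sequences=[y, x])
jv_ref = T.dot(J, v)

f = theano.function([x, v], [rop, jv_ref], updates=updates)
x_val = numpy.random.randn(5).astype(theano.config.floatX)
v_val = numpy.random.randn(5).astype(theano.config.floatX)
r_fast, r_slow = f(x_val, v_val)
assert numpy.allclose(r_fast, r_slow, atol=1e-5)
-----------------------------------------------------------------------------------------------------------------------------------------
Once an R_op for DownsampleFactorMax exists, the same comparison
applies with the pooling output in place of the stand-in expression.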

Fred

m_z...@sbox.tugraz.at

unread,
Dec 15, 2011, 7:01:45 AM12/15/11
to theano-dev
Thx ;)

I will have a try at writing this. I hope I will have some free time
over Christmas ;)

cheers, Mat


Rodrigo Benenson

unread,
Sep 4, 2015, 4:08:24 AM9/4/15
to theano-dev
Was this issue ever solved?

I just hit the same corner case...

regards,
rodrigob.