tanh activation function in pylearn2/theano


Yifei Chen

Apr 14, 2014, 2:52 PM
to pylear...@googlegroups.com
Hello all, 

In this document http://yann.lecun.com/exdb/publis/pdf/lecun-98b.pdf, the suggested hyperbolic tangent activation function is 1.7159 tanh(2/3 * x). I wonder whether this is adopted in pylearn2/Theano. I traced the "Tanh" class in the pylearn2 MLP module, but found it points to something in Theano like this:

@_scal_elemwise_with_nfunc('tanh', 1, 1)
def tanh(a):
    """hyperbolic tangent of a"""

I wonder how this works. And is "1.7159 tanh(2/3 * x)" adopted in practice?

Thanks!

Yifei
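
For reference, the scaled activation from the paper can be sketched in plain Python. This is only an illustration of the formula, not pylearn2 code:

```python
import math

def scaled_tanh(x):
    """LeCun et al.'s suggested activation: 1.7159 * tanh((2/3) * x)."""
    return 1.7159 * math.tanh(2.0 / 3.0 * x)

# The constants are chosen so that f(1) ~= 1 and f(-1) ~= -1, which keeps
# unit-variance inputs in the near-linear part of the curve.
print(round(scaled_tanh(1.0), 3))   # 1.0
print(round(scaled_tanh(-1.0), 3))  # -1.0
```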

Ian Goodfellow

Apr 14, 2014, 7:14 PM
to pylear...@googlegroups.com
As our docstring says, we just use tanh, not a rescaling of it. 
--
You received this message because you are subscribed to the Google Groups "pylearn-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to pylearn-user...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.



Yifei Chen

Apr 14, 2014, 9:57 PM
to pylear...@googlegroups.com
In practice, does plain tanh suffice? In the document the authors argue that the rescaled tanh works better.



Ian Goodfellow

Apr 15, 2014, 5:01 PM
to pylear...@googlegroups.com
In practice, people have switched to using ReLUs instead of sigmoidal activation functions of any kind.
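
A ReLU is just the positive part of its input; a one-line sketch:

```python
def relu(x):
    """Rectified linear unit: identity for positive inputs, zero otherwise."""
    return max(0.0, x)

print([relu(v) for v in (-2.0, -0.5, 0.0, 3.0)])  # [0.0, 0.0, 0.0, 3.0]
```

Unlike tanh, it does not saturate for large positive inputs, which is one reason it became the default choice.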



Yifei Chen

Apr 16, 2014, 5:05 AM
to pylear...@googlegroups.com
Oh, do you mean rectified linear?

Ian Goodfellow

Apr 16, 2014, 9:08 AM
to pylear...@googlegroups.com
http://en.m.wikipedia.org/wiki/Rectifier_(neural_networks)

Yifei Chen

Apr 17, 2014, 7:29 PM
to pylear...@googlegroups.com
Thank you for the pointers!

pso...@gmail.com

Sep 9, 2015, 12:01 PM
to pylearn-users
I had the same question. I am trying to use pylearn2 to implement a convolutional network architecture that uses the tanh activation function, and yes, it uses 1.7159 tanh(2/3 * x) to center the output around 0, with values between -1.7159 and 1.7159. Can you give me a pointer to where I should look in order to change pylearn2 to use this activation?

Also, since the output range of this activation differs from the sigmoid's, I assume the cost cannot be the same. Based on the documentation, the cost for convnets is hardcoded to the sigmoid cost. What would I need to change in the code to use a different cost function?

Thanks
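
Not a pylearn2 patch, but the two pieces a custom layer would need, the element-wise activation and its derivative, can be written down directly. The names here are illustrative, not pylearn2's API:

```python
import math

A, B = 1.7159, 2.0 / 3.0  # LeCun et al.'s scaling constants

def scaled_tanh(x):
    """Element-wise activation: A * tanh(B * x), range (-A, A)."""
    return A * math.tanh(B * x)

def scaled_tanh_grad(x):
    """Derivative: d/dx [A * tanh(B * x)] = A * B * (1 - tanh(B * x)**2)."""
    return A * B * (1.0 - math.tanh(B * x) ** 2)

# Finite-difference sanity check of the derivative at x = 0.5.
h = 1e-6
numeric = (scaled_tanh(0.5 + h) - scaled_tanh(0.5 - h)) / (2 * h)
print(abs(numeric - scaled_tanh_grad(0.5)) < 1e-6)  # True
```

On the cost question: since the outputs lie in (-1.7159, 1.7159) rather than (0, 1), a mean-squared-error cost against suitably scaled targets is the natural fit, rather than the cross-entropy cost that assumes sigmoid outputs.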

huane...@gmail.com

Apr 19, 2016, 3:05 AM
to pylearn-users
Hi, I want to know how to switch from sigmoid to tanh. Can you tell me how to do it?
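
For what it's worth, the two functions are closely related, which is why swapping one for the other mostly amounts to rescaling: tanh(x) = 2 * sigmoid(2x) - 1. A quick check:

```python
import math

def sigmoid(x):
    """Logistic sigmoid: 1 / (1 + exp(-x)), range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# tanh is a shifted, rescaled sigmoid: tanh(x) = 2 * sigmoid(2x) - 1,
# so a sigmoid layer and a tanh layer differ only in output range.
x = 0.8
print(abs(math.tanh(x) - (2.0 * sigmoid(2.0 * x) - 1.0)) < 1e-12)  # True
```

In practice the switch also means adjusting anything that assumes outputs in (0, 1), such as targets and the cost function.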