set_activation_function_layer


Harold

Aug 15, 2008, 3:10:41 PM
to ruby_fann
After having constructed a neural net and while setting its
parameters, I get the following error:

Calling `set_activation_function_layer': wrong number of arguments (2
for 1) (ArgumentError)

The intent is to set the activation function of the input layer. The
RDoc specifies two parameters. What am I missing?

This is how the method is being called:

in_data = RubyFann::TrainData.new(:filename => 'data.train')
test_data = RubyFann::TrainData.new(:filename => 'data.test')
fann = RubyFann::Standard.new(
  :num_inputs => 10,
  :hidden_neurons => [6],
  :num_outputs => 1
)
fann.set_train_stop_function(:mse)
# this is the line causing the headaches:
fann.set_activation_function_layer(:sigmoid, 0) # 0 = input layer
fann.set_activation_function_output(:sigmoid)
[... etc ...]

Steven Miers

Aug 15, 2008, 4:04:50 PM
to ruby...@googlegroups.com
Harold,

Sorry about that.  It's actually a bug.  I've fixed it just now (ruby-fann 0.7.9).  You should be able to get the latest version by doing a "sudo gem update ruby-fann".  It may take a few minutes for the new gem to be available on rubyforge.

One caveat is that, according to the FANN documentation:

/* Function: fann_set_activation_function

   Set the activation function for neuron number *neuron* in layer number *layer*,
   counting the input layer as layer 0.
  
   It is not possible to set activation functions for the neurons in the input layer.
*/

I've also included a unit test for the new/fixed functionality.

-Steven Miers

Harold

Aug 15, 2008, 4:17:48 PM
to ruby_fann
Thanks again for your prompt response, Steven.

I suppose the logic behind not being able to set the activation
function on the input layer is that it is just a "pass-through" layer,
where the inputs are propagated as-is. Just guessing here, but
depending on the implementation that might be the case...thoughts?

Best,

-Harold Gimenez


Steven Miers

Aug 15, 2008, 5:20:59 PM
to ruby...@googlegroups.com
Harold,

In FANN, the activation function is applied to the weighted sum of a neuron's inputs.  Since the input layer doesn't itself have any inputs, that makes sense.  So, in your case, you may want to set the activation function for the first hidden layer instead.  I believe I usually set the same activation function for all of the hidden layers at once, which is why I never discovered that bug.
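To illustrate the point for anyone reading along later: here is a minimal pure-Ruby sketch of a FANN-style neuron (not ruby-fann's or FANN's actual implementation, and the weights/inputs are made up). It shows why the activation function only matters from the first hidden layer onward.

```ruby
# Sketch of a FANN-style neuron: the activation function is applied
# to the weighted sum of the neuron's inputs. Input-layer "neurons"
# have no incoming weights, so there is nothing to activate.
def sigmoid(x)
  1.0 / (1.0 + Math.exp(-x))
end

def neuron_output(inputs, weights)
  weighted_sum = inputs.zip(weights).map { |i, w| i * w }.sum
  sigmoid(weighted_sum)
end

# Layer 0 (the input layer) just passes these values through:
inputs = [0.5, -1.0, 2.0]

# A neuron in layer 1 (first hidden layer) applies weights, then the activation:
weights = [0.4, 0.3, -0.2]
puts neuron_output(inputs, weights) # sigmoid(-0.5), approx 0.3775
```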

I just checked, and the gem should be available now.

Good luck!
-Steven

Harold

Aug 15, 2008, 5:26:11 PM
to ruby_fann
Makes sense. I also use the same activation function for all hidden
layers (so far), for which I'm using "set_activation_function_hidden".
Works like a champ.

Thanks for your help,

-Harold
