Thanks again for your prompt response, Steven.
I suppose the logic behind not being able to set the activation
function on the input layer is that it is just a "pass-through" layer,
where the inputs are propagated as is. Just guessing here, but
depending on the implementation that might well be the case. Thoughts?
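
For anyone else following along, here's roughly the kind of call in
question. This is just a sketch: the network shape is made up, and I'm
assuming the 0.7.9 gem exposes set_activation_function with FANN's
(function, layer, neuron) argument ordering, counting the input layer
as layer 0:

  require 'rubygems'
  require 'ruby-fann'

  # Toy network: 3 inputs, one hidden layer of 4 neurons, 1 output.
  fann = RubyFann::Standard.new(:num_inputs => 3,
                                :hidden_neurons => [4],
                                :num_outputs => 1)

  # The input layer is 0, so the hidden layer is 1 and the output layer is 2.
  fann.set_activation_function(:sigmoid_symmetric, 1, 0)  # hidden neuron 0
  fann.set_activation_function(:linear, 2, 0)             # output neuron 0

  # Per the FANN docs quoted below, layer 0 (the inputs) is off limits:
  # fann.set_activation_function(:sigmoid, 0, 0)  # expected to fail / no-op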
Best,
-Harold Gimenez
On Aug 15, 4:04 pm, "Steven Miers" <steven.mi...@gmail.com> wrote:
> Harold,
>
> Sorry about that. It's actually a bug. I've fixed it just now (ruby-fann
> 0.7.9). You should be able to get the latest version by doing a "sudo gem
> update ruby-fann". It may take a few minutes for the new gem to be
> available on rubyforge.
>
> One caveat is that, according to the FANN documentation:
>
> /* Function: fann_set_activation_function
>
>    Set the activation function for neuron number *neuron* in layer
>    number *layer*, counting the input layer as layer 0.
>
>    *It is not possible to set activation functions for the neurons
>    in the input layer.*
> */
>
> I've also included a unit test for the new/fixed functionality.
>
> -Steven Miers
>