Cyclic Networks - Number of iterations


a...@mnet.me

Sep 7, 2017, 2:28:30 PM
to SharpNEAT
Hi,

All praises for this library. I like it very much.

I was playing with it and I came across something that made me scratch my head for some time.
It's in the activation function for CyclicNetwork. Here is the code:

for(int i=0; i<_timestepsPerActivation; i++)
{
    // Loop over all connections.
    // Calculate each connection's output signal by multiplying its weight by the output value
    // of its source neuron.
    // Add the connection's output value to the target neuron's input value. Neurons therefore
    // accumulate all input value from connections targeting them.
    for(int j=0; j<connectionCount; j++)
    {
        Connection connection = _connectionList[j];
        connection.OutputValue = connection.SourceNeuron.OutputValue * connection.Weight;
        connection.TargetNeuron.InputValue += connection.OutputValue;
    }

    // Loop over all output and hidden neurons, passing their input signal through their activation
    // function to produce an output value. Note we skip bias and input neurons because they have a
    // fixed output value.
    for(int j=_inputAndBiasNeuronCount; j<neuronCount; j++)
    {
        Neuron neuron = _neuronList[j];
        neuron.OutputValue = neuron.ActivationFunction.Calculate(neuron.InputValue, neuron.AuxiliaryArguments);

        // Reset input value, in preparation for the next timestep/iteration.
        neuron.InputValue = 0.0;
    }
}



In the loop that passes the input signal through the activation function to produce the output value, each neuron's output value is simply overwritten with the latest result and its input value is reset to 0. This way, any number of cycles would leave the same neuron output, so there seems to be no point in going through multiple cycles.

Am I missing something, or is it supposed to work this way?

Thanks.

Colin Green

Sep 7, 2017, 5:30:37 PM
to shar...@googlegroups.com
Hi, 

The loop you refer to takes the already-calculated input value for each neuron and passes it through the activation function to get the output value. The input value is then reset, ready for the next activation of the network.

Which line specifically do you think is a problem?

Colin


D

Sep 8, 2017, 12:19:26 PM
to SharpNEAT
Hi, 

From my understanding, every iteration should include the output value from the previous iteration, and in that case it is not the same whether you run 1 or 4 iterations.
In the code above, if I have not made a logic error, it is the same whether you run 1 or multiple iterations, because only the last one has any effect (every iteration produces the same output).
So shouldn't the line in the second nested loop be like this:
neuron.InputValue = neuron.OutputValue;

so that on the next iteration the line in the first nested loop:

connection.TargetNeuron.InputValue += connection.OutputValue;

will use the output from the previous iteration?

And after all iterations have finished, the InputValues should then be set to 0, so that the next network activation does not accumulate old values that are not involved in the current activation.

I'm not sure if I am being clear enough?

Colin Green

Sep 11, 2017, 6:52:20 AM
to shar...@googlegroups.com
With the cyclic network each neuron has two state variables associated with it:

neuron.InputValue
neuron.OutputValue


And each connection references both the source and target neuron objects:

connection.SourceNeuron
connection.TargetNeuron


A single iteration of the whole network consists of:

// Loop connections.
foreach(Connection connection)
{
    // connection.SourceNeuron.OutputValue here is the output signal from the previous
    // iteration, so for hidden neurons this is zero to start with, and for input
    // neurons it is the input values to the network as a whole.
    double connectionOutput = connection.Weight * connection.SourceNeuron.OutputValue;

    // Sum/accumulate the connection output signals onto their target neurons.
    connection.TargetNeuron.InputValue += connectionOutput;
}

// Loop neurons (hidden and output neurons only).
foreach(Neuron neuron)
{
    // Apply the activation function to the accumulated input signal at each neuron,
    // thus obtaining each neuron's output signal ready for the next iteration.
    neuron.OutputValue = activationFunction(neuron.InputValue);

    // Reset/zero the input value now that this iteration is over, i.e. InputValue
    // is a working state variable for use within each iteration only.
    neuron.InputValue = 0.0;
}


So there's a two-stage process: one stage where neuron output signals from
the previous timestep are propagated to their target neurons, and
another that applies the activation function to get updated neuron
output signals for the current timestep. Then the process repeats for
however many timesteps are required/configured.
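
As a rough standalone sketch of why the number of timesteps matters (made-up names and a toy topology, not the SharpNEAT classes themselves, just the same two-stage scheme), here is a tiny network with one input, a hidden neuron with a recurrent self-connection, and one output. The printed values change at every timestep precisely because each timestep reads the previous timestep's outputs:

using System;

class CyclicToyExample
{
    // A simple logistic activation; the exact function doesn't matter for the point.
    static double Activate(double x) => 1.0 / (1.0 + Math.Exp(-x));

    static void Main()
    {
        // Neuron state: [0]=input, [1]=hidden (with a self-connection), [2]=output.
        double[] outputVal = new double[3];
        double[] inputVal  = new double[3];

        // Connections: (source, target, weight).
        var connections = new (int Src, int Tgt, double W)[]
        {
            (0, 1, 0.5),   // input  -> hidden
            (1, 1, 0.8),   // hidden -> hidden (recurrent)
            (1, 2, 1.0),   // hidden -> output
        };

        outputVal[0] = 1.0; // the network's input signal, fixed for this activation

        for(int t = 0; t < 4; t++)
        {
            // Stage 1: propagate previous-timestep outputs along the connections.
            foreach(var c in connections)
                inputVal[c.Tgt] += outputVal[c.Src] * c.W;

            // Stage 2: activate hidden and output neurons, then reset their input values.
            for(int j = 1; j < 3; j++)
            {
                outputVal[j] = Activate(inputVal[j]);
                inputVal[j] = 0.0;
            }

            Console.WriteLine($"t={t}: hidden={outputVal[1]:F4}, output={outputVal[2]:F4}");
        }
    }
}

At t=0 the output neuron only sees the hidden neuron's initial (zero) output, but from t=1 onwards the hidden neuron's recurrent connection feeds its previous output back in, so the values keep evolving from one timestep to the next.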

This is very different to the acyclic network, in which each neuron is
located in a layer and is activated only once as activation progresses
down the layers. In the cyclic network all of the neurons are
activated simultaneously at each timestep, using neuron output values
from the previous timestep as the input values for the current
timestep.
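
For contrast, a minimal sketch of the acyclic/feed-forward case (again with made-up names, not the SharpNEAT acyclic network implementation): neurons are processed in layer order, so each one is activated exactly once per network activation:

using System;

class AcyclicToyExample
{
    static double Activate(double x) => 1.0 / (1.0 + Math.Exp(-x));

    static void Main()
    {
        // Neurons in layer order: [0]=input, [1]=hidden, [2]=output.
        double[] preActivation  = new double[3];
        double[] postActivation = new double[3];

        // Connections: (source, target, weight); every source precedes its target.
        var connections = new (int Src, int Tgt, double W)[]
        {
            (0, 1, 0.5),   // input  -> hidden
            (1, 2, 1.0),   // hidden -> output
        };

        postActivation[0] = 1.0; // network input

        // Single pass: by the time we reach a neuron, all of its inputs are final,
        // so each neuron is activated exactly once.
        for(int j = 1; j < 3; j++)
        {
            foreach(var c in connections)
                if (c.Tgt == j)
                    preActivation[j] += postActivation[c.Src] * c.W;

            postActivation[j] = Activate(preActivation[j]);
        }

        Console.WriteLine($"output={postActivation[2]:F4}");
    }
}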

That is the intention and I believe the code is correct with respect
to that intention.



> So shouldn't the line in second nested loop be like this:
> neuron.InputValue = neuron.OutputValue;


At the start of each iteration/timestep each neuron's state is held in
OutputValue; that is the one and only activation level for each
neuron. InputValue exists as a temporary/intermediate state variable
at each neuron, holding the neuron's pre-activation level. This
separation of the pre- and post-activation levels is necessary because
all of the neurons are effectively being activated simultaneously,
i.e. we need to keep a copy of the activation level from the previous
timestep until we are sure it is no longer needed (i.e. until we
finish the first loop, over the connections).
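
Here's a quick sketch of why that separation matters (standalone made-up code, not the SharpNEAT implementation): take two neurons that feed each other. With a separate pre-activation accumulator both neurons read each other's previous-timestep output; if we instead overwrote the output values in place as we went, the second neuron would see the first neuron's freshly updated value, and the result would depend on processing order:

using System;

class BufferingExample
{
    static double Activate(double x) => Math.Tanh(x);

    static void Main()
    {
        // Two neurons feeding each other: A -> B and B -> A, weight 1.0 each.
        double outA = 0.25, outB = 0.75;

        // Buffered (the two-stage scheme): both read previous-timestep outputs.
        double newA = Activate(outB * 1.0);
        double newB = Activate(outA * 1.0);
        Console.WriteLine($"buffered:   A={newA:F4}, B={newB:F4}");

        // Unbuffered, updating in place: B reads A's *new* value, so the result
        // depends on whether A or B happens to be processed first.
        double a = 0.25, b = 0.75;
        a = Activate(b * 1.0);
        b = Activate(a * 1.0);   // uses the already-updated 'a'
        Console.WriteLine($"unbuffered: A={a:F4}, B={b:F4}");
    }
}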

So I think the issue here is that your understanding of the intent is
slightly off, rather than there being an error in the code per se.

What do you think?

Colin

D

Sep 12, 2017, 1:49:45 PM
to SharpNEAT
Yeah, now I fully understand.

Thank you very much :)

Colin Green

Sep 12, 2017, 3:39:03 PM
to shar...@googlegroups.com
Cool. No problem.