Two-factor synaptic plasticity rule


Emma

Aug 1, 2024, 6:22:04 AM
to ANNarchy

Dear Julien, 

 

I have one more question…

 

I want to implement a two-factor learning rule in a rate-based network. As an example, I am using a network with three populations. 

 

Pop2 and pop3 would be connected via proj2_3. The weight of proj2_3 should be plastic, depending on the firing rates of pop1 and pop2 and some learning rate lr (dw/dt = pop1.r * pop2.r * lr). 
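Just to make the rule concrete, here is a plain-Python forward-Euler sketch of dw/dt = pop1.r * pop2.r * lr, outside of ANNarchy; the constant rates and the 1 ms step are placeholder assumptions for illustration, not the actual LeakyIntegrator dynamics:

```python
# Forward-Euler integration of dw/dt = pop1.r * pop2.r * lr.
# Constant rates are an assumption just to see the expected weight growth.
lr = 0.1
dt = 1.0                      # ANNarchy's default time step (1 ms)
pop1_r, pop2_r = 10.0, 5.0    # assumed constant rates

w = 1.0
for _ in range(10):           # 10 steps of dt, i.e. simulate(10.0)
    w += dt * lr * pop1_r * pop2_r

print(w)                      # 1.0 + 10 * 5.0 = 51.0
```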

 

My initial idea was to implement a projection with a plastic synapse model from pop1 to proj2_3. I’m inserting a (not working) code example below to illustrate. However, only populations seem to be accepted as the pre and post arguments of a Projection. 

 

I also considered somehow making the rate of pop1 globally accessible, but I am not sure whether this is possible, as it would have to be updated at every dt. 

 

I also found an earlier question, https://groups.google.com/u/3/g/annarchy/c/dO3BKs3dR-A/m/N1_uxGZRCQAJ, but since that network is spiking, I can’t quite map the solution onto my rate-based case. 

 

I hope my goal is understandable. 

 

Example: 

lr = Constant('lr', 0.1)


input_pop1 = Population(1, Neuron(parameters="r=10.0"))
input_pop2 = Population(1, Neuron(parameters="r=5.0"))

pop1 = Population(1, LeakyIntegrator)
pop2 = Population(1, LeakyIntegrator)
pop3 = Population(1, LeakyIntegrator)

syn2_3 = Synapse(equations="psp = w * pre.r")  # default rate-coded synapse
syn_plast = Synapse(equations="dw/dt = pre.r * lr * post.psp")

proj1 = Projection(input_pop1, pop1, target="exc").connect_all_to_all(1.0)
proj2 = Projection(input_pop2, pop2, target="exc").connect_all_to_all(1.0)
proj2_3 = Projection(pop2, pop3, target="exc", synapse=syn2_3).connect_all_to_all(1.0)
proj_plast = Projection(pop1, proj2_3, target="exc", synapse=syn_plast).connect_all_to_all(1.0)

compile()
simulate(10.0)

 

 

Thank you so so much for your time. 

 

Best, 

Emma 

julien...@gmail.com

Aug 1, 2024, 7:23:47 AM
to ANNarchy
Hi Emma,

indeed, it is only possible to use populations as the pre and post objects when creating a Projection. 

Assuming that pop1 and pop2 have the same size, you could create a one-to-one projection from pop1 to pop2 with a weight of 1.0. This stores a copy of pop1's firing rate (delayed by one dt, though) in a variable of pop2, which the learning rule can then access through the pre-synaptic neuron:

pop2_neuron = ann.Neuron(
    equations="""
        pop1_r = sum(copy)
        tau * dr/dt + r = ...
    """
)

syn_plast = ann.Synapse(equations="dw/dt = lr * pre.r * pre.pop1_r")

proj_pop1_pop2 = ann.Projection(pop1, pop2, 'copy').connect_one_to_one(weights=1.0)
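To illustrate the one-dt delay outside of ANNarchy, here is a plain-Python sketch of the copy mechanism; the pop1 rate trace and the constant pop2 rate are made-up values, and the update order mimics how the copied rate only becomes visible one step later:

```python
# Sketch of the copy trick: the one-to-one projection with weight 1.0
# makes pop2's variable pop1_r equal pop1's rate from the PREVIOUS step,
# which the plastic synapse then reads as pre.pop1_r.
dt, lr = 1.0, 0.1
pop1_r_trace = [10.0, 12.0, 8.0, 11.0]   # made-up pop1 rates per step

pop1_r_copy = 0.0    # pop2's pop1_r variable, starts at 0
pop2_r = 5.0         # constant pop2 rate, for simplicity
w = 1.0
copies = []
for r1 in pop1_r_trace:
    # the synapse update sees the copy from the previous step
    w += dt * lr * pop2_r * pop1_r_copy
    copies.append(pop1_r_copy)
    # sum(copy) delivers pop1's current rate at the next step
    pop1_r_copy = r1

print(copies)   # [0.0, 10.0, 12.0, 8.0] -- lags pop1_r_trace by one dt
```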

Best regards,
Julien Vitay

Emma

Aug 9, 2024, 9:59:57 AM
to ANNarchy
Thank you Julien, that worked perfectly :-) 