Dear Julien,
I have one more question…
I want to implement a two-factor learning rule in a rate-based network. As an example, I am using a network with three populations.
pop2 and pop3 would be connected via proj2_3. The weight of proj2_3 should be plastic, depending on the firing rates of pop1 and pop2 and some learning rate lr (dw/dt = pop1.r * pop2.r * lr).
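Just to make the intended dynamics concrete, here is a plain-Python Euler sketch of the rule, outside ANNarchy (the constant rates 10.0 and 5.0 are only example values, not what the leaky integrators would actually output):

```python
# Toy Euler integration of dw/dt = pop1.r * pop2.r * lr,
# with both rates held constant for simplicity.
dt = 1.0          # integration step (ms)
lr = 0.1          # learning rate
r_pop1 = 10.0     # example rate of pop1 (the modulating factor)
r_pop2 = 5.0      # example rate of pop2 (the pre-synaptic side of proj2_3)
w = 1.0           # initial weight of proj2_3

for _ in range(10):                   # 10 steps of dt, i.e. simulate(10.0)
    w += dt * r_pop1 * r_pop2 * lr   # dw/dt = pop1.r * pop2.r * lr

print(w)  # 1.0 + 10 * (10.0 * 5.0 * 0.1) = 51.0
```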
My initial idea was to implement a projection with a plastic synapse model from pop1 to proj2_3; I'm inserting a (non-working) code example below to illustrate. However, only Populations seem to be accepted as the pre- and post-synaptic side of a Projection.
I also considered somehow making the rate of pop1 globally accessible, but I am not sure whether that is possible, as it would have to be updated at every time step dt.
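One more idea I tried to sketch, reusing pop1/pop2/pop3 from my example below: route pop1's rate onto pop3 through an extra projection with weight 1 and a dedicated target, and read it inside the synapse via post.sum(). The target name "mod", the projection proj_mod, and the use of post.sum(mod) in a synapse equation are my own guesses here; I am not sure ANNarchy actually supports this:

```python
from ANNarchy import *  # assumes pop1, pop2, pop3 from my example below exist

lr = Constant('lr', 0.1)

# Guess: with weight 1.0 on proj_mod, post.sum(mod) inside the synapses of
# proj2_3 (pre = pop2, post = pop3) should equal pop1.r at every step.
syn_plast = Synapse(
    equations="dw/dt = pre.r * post.sum(mod) * lr"  # = pop2.r * pop1.r * lr?
)

proj_mod = Projection(pop1, pop3, target="mod").connect_all_to_all(1.0)
proj2_3 = Projection(pop2, pop3, target="exc", synapse=syn_plast).connect_all_to_all(1.0)
```

Would something along these lines work, or is there an intended way to bring a third population's rate into a synapse equation?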
I also found an earlier question https://groups.google.com/u/3/g/annarchy/c/dO3BKs3dR-A/m/N1_uxGZRCQAJ. But due to the spiking nature of that network I can’t quite connect the dots.
I hope my goal is understandable.
Example:
from ANNarchy import *

lr = Constant('lr', 0.1)

input_pop1 = Population(1, Neuron(parameters="r = 10.0"))
input_pop2 = Population(1, Neuron(parameters="r = 5.0"))

pop1 = Population(1, LeakyIntegrator)
pop2 = Population(1, LeakyIntegrator)
pop3 = Population(1, LeakyIntegrator)

syn2_3 = Synapse(psp="w * pre.r")  # the default rate-coded psp
syn_plast = Synapse(equations="dw/dt = pre.r * lr * post.psp")

proj1 = Projection(input_pop1, pop1, target="exc").connect_all_to_all(1.0)
proj2 = Projection(input_pop2, pop2, target="exc").connect_all_to_all(1.0)
proj2_3 = Projection(pop2, pop3, target="exc", synapse=syn2_3).connect_all_to_all(1.0)

# This is the part that does not work: a Projection (proj2_3) is not
# accepted as the post-synaptic side of another Projection.
proj_plast = Projection(pop1, proj2_3, target="exc", synapse=syn_plast).connect_all_to_all(1.0)

compile()
simulate(10.0)
Thank you so much for your time.
Best,
Emma