How to do synaptic normalization with Synapses object?


Marcel Beining

Aug 25, 2014, 9:30:18 AM8/25/14
to brians...@googlegroups.com
As the topic says, I have problems using the Synapses object to introduce synaptic normalization (and many other things).

Something like a network_operation does not work, at least not the way I do it:

w_tot = sum(syn.w.data)
syn.w[:,:] /= (w_tot / normalize_to_this_value)

or

syn.w.data /= (w_tot / normalize_to_this_value)

Postsynaptic code cannot work either, since I need to normalize the weights of all incoming synapses to a single neuron, not all weights to all neurons.

Any idea?

Marcel Beining

Aug 26, 2014, 9:26:52 AM8/26/14
to brians...@googlegroups.com
If you're interested:
in the meantime I succeeded in doing it this way:

@network_operation(clock=syn_clock)
def syn_norm():
    for j in xrange(number_postcells):
        sum_in = sum(synobj.w[:, j])
        if sum_in > norm_factor:     # only rescale if the summed recurrent input exceeds the target
            synobj.w[:, j] = synobj.w[:, j] * norm_factor / sum_in
            print 'Normalized synapses of postsynaptic cell %d from %d nS to %d nS' % (j, sum_in * 1e9, sum(synobj.w[:, j]) * 1e9)
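For reference, the arithmetic of this per-postsynaptic-cell rule can be written in plain numpy; the matrix W and norm_factor below are made-up toy values, not from the actual model:

```python
import numpy as np

norm_factor = 1.0
# Toy weight matrix: rows = presynaptic cells, columns = postsynaptic cells
W = np.array([[0.6, 0.2],
              [0.9, 0.1]])
col_sums = W.sum(axis=0)               # summed incoming weight per postsynaptic cell
over = col_sums > norm_factor          # rescale only where the sum exceeds the target
W[:, over] *= norm_factor / col_sums[over]
```

Columns whose incoming sum exceeds norm_factor are scaled down to exactly norm_factor; the others are left untouched, matching the `if` in the network operation above.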

Marcel Stimberg

Aug 26, 2014, 11:15:07 AM8/26/14
to brians...@googlegroups.com
Hi Marcel,

I think your approach is the best option you currently have; in Brian2
we are trying to make things like this possible without a network
operation, and therefore more efficiently.

In your network operation, you should be able to get a small speed gain
by using "synobj.w[:, j] *= ..." because it needs only a single
indexing operation. It may also be worth doing some profiling: if this
network operation is slowing down your code a lot, it might be worth
doing the operation on the raw underlying array (i.e. using
synobj.w[:]), based on some pre-calculations done beforehand (since the
connections are fixed).
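A sketch of that raw-array idea in plain numpy: the per-synapse target indices (`post` below) are extracted once, since the connections are fixed, and then the whole normalization runs as vectorized operations on the flat weight array. All names and values here are illustrative, not part of Brian's API:

```python
import numpy as np

norm_factor = 1.0
# Extracted once from the Synapses object, since connections are fixed:
post = np.array([0, 0, 1, 1, 1, 2])   # postsynaptic target of each synapse
# Flat per-synapse weight array (the kind of view synobj.w[:] exposes):
w = np.array([0.5, 1.0, 0.2, 0.2, 0.6, 2.0])

sums = np.bincount(post, weights=w)   # summed incoming weight per target
scale = np.where(sums > norm_factor, norm_factor / sums, 1.0)
w *= scale[post]                      # one vectorized pass, no Python loop
```

This replaces the per-neuron Python loop with a single `bincount` plus one elementwise multiply, which is where the speed gain for large networks would come from.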

Best,
Marcel

luke.y...@gmail.com

Nov 11, 2015, 5:51:25 AM11/11/15
to Brian
Hi,

I thought I'd post on here rather than opening a new thread.

I'm also trying to implement synaptic normalisation, and saw that you had planned to make it possible without a network operation. Is that feature available yet?

Best Wishes,
Luke

luke.y...@gmail.com

Nov 11, 2015, 1:59:33 PM11/11/15
to Brian, luke.y...@gmail.com
Again, if anyone is interested: I managed to implement it without a network operation, so there is no significant slowdown. It can be achieved by having a variable sumw in the NeuronGroup model that gets updated by a (summed) instruction in the synapse model, and then dividing the weight update by sumw_post in the pre- and postsynaptic code. If there is a problem with that approach, I haven't found it yet.

flora.bo...@gmail.com

Jun 19, 2017, 5:54:25 PM6/19/17
to Brian, luke.y...@gmail.com
Hello all,

In the end, is Luke's solution the best option in Brian2?

Thanks

Flora

Marcel Stimberg

Jun 20, 2017, 11:51:15 AM6/20/17
to brians...@googlegroups.com
Hi Flora,

yes, if you want to do the kind of continuous synaptic normalization
(i.e. your weights are changing during the simulation) that Luke
described, then using a "(summed)" variable is the best approach.

Best,

Marcel


Message has been deleted

flora.bo...@gmail.com

Jun 21, 2017, 11:26:05 AM6/21/17
to Brian
Hello Marcel

Thanks very much for answering.
I had written another question, but in the end I deleted it because there was an initialization problem.

If it is not too demanding of your time, I just want to make sure I wrote it the right way. Here is my piece of code; I'm not sure about it (in particular the parts involving sumw_post):

  eqs_rcn = '''
  dS_rnd/dt = -S_rnd/RndSynapseTau : 1
  G_rnd = S_rnd + bias : 1
  rate_rnd = 0.4*(1 + tanh(fI_slope*G_rnd - 3))/RndSynapseTau : Hz
  sumw : 1
  '''

  taupre = taupost = 20*ms
  Apre = 0.01
  Apost = -Apre*taupre/taupost*1.05
  wmin = 0
  wmax = 0.01

  stdp_rcn_rcn = '''
  w : 1
  dapre/dt = -apre/taupre : 1 (event-driven)
  dapost/dt = -apost/taupost : 1 (event-driven)
  sumw_post = w : 1 (summed)
  '''

  # what happens when a presynaptic spike arrives at a synapse
  on_pre_rcn_rcn = '''
  S_rnd += w
  apre += Apre
  w = clip(w + apost, wmin, wmax)
  w = w/sumw_post
  '''

  # executed whenever a postsynaptic spike occurs
  on_post_rcn_rcn = '''
  apost += Apost
  w = clip(w + apre, wmin, wmax)
  w = w/sumw_post
  '''


with 
  RCN_pool = NeuronGroup(N_rnd, eqs_rcn, threshold='rand()<rate_rnd*dt')
  RCN_pool.S_rnd = 'InitSynapseRange*rand()'

and 

  Rec_RCN_RCN = Synapses(RCN_pool, RCN_pool, model=stdp_rcn_rcn, on_pre=on_pre_rcn_rcn, on_post=on_post_rcn_rcn)
  Rec_RCN_RCN.connect()

  Rec_RCN_RCN.w = 'rand() * wmax'


This RCN population is connected to other 'sensory' populations without any learning.


Would you write it like that too? I just want to normalize the weights of all incoming synapses to each neuron. However, if some of the weights were zero, how would I write a condition in the on_pre and on_post code so that the normalization only happens when the sum is non-zero?

Thank you very much

Best

Flora

Marcel Stimberg

Jun 22, 2017, 1:52:27 PM6/22/17
to brians...@googlegroups.com
Hi Flora,

just a quick general remark first: please don't delete questions. Quite
a few of us (me included) use the Google group as a mailing list; for us,
deleting old messages does not change anything, and it just makes things
more confusing.

Your code looks ok; you have to decide whether the details of the
normalization work for you. For example, in your case the normalization
is applied after the clipping, so you could end up with weights smaller
than wmin if wmin were > 0. Another option (I think this is what Luke
did earlier in this thread) is to not actually normalize the weights
themselves, but only their effect on the postsynaptic cell, i.e. use
S_rnd += w/sumw_post in on_pre and not divide by sumw_post anywhere
else. This would also avoid an issue in your current code: the
sumw_post used to normalize was calculated *before* the weight change,
so the sum of all weights will not be exactly normalized after a
weight change.

The easiest way to avoid divisions by zero is probably to not divide by
sumw_post directly, but by something like clip(sumw_post, 1e-15, inf).
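Combining those two suggestions (normalize the effect rather than the weight, and guard the division), Flora's update strings might become something like the sketch below; using clip(sumw_post, 1e-15, inf) as the divisor is an assumption about the intended expression:

```python
# Sketch only: guarded, effect-normalizing versions of Flora's update strings.
# The weights themselves are no longer divided by sumw_post; only their
# effect on S_rnd is, and the clip prevents division by zero.
on_pre_rcn_rcn = '''
S_rnd += w/clip(sumw_post, 1e-15, inf)
apre += Apre
w = clip(w + apost, wmin, wmax)
'''

on_post_rcn_rcn = '''
apost += Apost
w = clip(w + apre, wmin, wmax)
'''
```

With this variant the clipped weights stay within [wmin, wmax], and the normalization always uses the up-to-date sum at the moment a spike is delivered.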

Hope that helps, best
Marcel

flora.bo...@gmail.com

Jun 22, 2017, 2:59:55 PM6/22/17
to Brian
Thank you so much and sorry for the deleting !