RBF net layer?


Steve O'Hagan

Jul 11, 2016, 10:50:51 AM
to Keras-users

I'm wondering whether it would be feasible to implement a radial basis function (RBF) layer. I'm envisaging that it would be initialised via k-means (or similar clustering), but that the centres and sigmas of the RBFs would then remain trainable, so that RBF layers and ordinary dense layers could be stacked together and trained as a whole.
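For later readers: the forward pass of the layer being described can be sketched in plain NumPy (not Keras code). The centre and sigma values below are hypothetical placeholders for what k-means initialisation would produce, and this sketch uses the common exp(-||x - c||² / (2σ²)) convention:

```python
import numpy as np

def rbf_forward(x, centres, sigmas):
    """Gaussian RBF activations exp(-||x - c||^2 / (2*sigma^2)).

    x:       (n_samples, n_features)
    centres: (n_units, n_features)
    sigmas:  (n_units,)
    """
    # Pairwise squared distances via broadcasting -> (n_samples, n_units)
    diff = x[:, None, :] - centres[None, :, :]
    sq_dist = np.sum(diff ** 2, axis=-1)
    return np.exp(-sq_dist / (2.0 * sigmas ** 2))

# Hypothetical centres/sigmas (in practice, from k-means on the inputs)
x = np.array([[0.0, 0.0], [1.0, 1.0]])
centres = np.array([[0.0, 0.0], [1.0, 1.0]])
sigmas = np.array([1.0, 1.0])
phi = rbf_forward(x, centres, sigmas)
# phi[0, 0] == 1.0 (sample sits on centre 0); phi[0, 1] == exp(-1)
```

In a trainable layer, `centres` and `sigmas` would become weight tensors updated by backprop alongside any stacked dense layers.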

Any thoughts?

Cheers,
Steve.

Daniel Hurwitz

Nov 30, 2016, 5:14:42 AM
to Keras-users
Hi Steve,
I would like to try out an RBF neural network in Keras, but it is proving very difficult for me to understand and implement.
Have you made any progress implementing an RBFN in Keras, or can you share any code you have?
Cheers!

Stephen O'hagan

Nov 30, 2016, 11:03:34 AM
to Daniel Hurwitz, Keras-users
Hi,

I was unable to get a multi-node RBF layer to work ☹

I did manage to get one RBF node per layer (output dim = 1) and, using a merge layer, kind of faked a multi-node layer. Probably not the most elegant solution… partial code follows:

<code>
import numpy as np
from keras import backend as K
from keras.engine.topology import Layer

class RBFLayer(Layer):
    def __init__(self, output_dim, **kwargs):
        assert output_dim == 1  # restricted to a single RBF node
        self.output_dim = output_dim
        super(RBFLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        input_dim = input_shape[1]
        # Random initial centre (C) and width (B); both are trained
        initCentres = np.random.random((self.output_dim, input_dim))
        initBetas = np.random.random()
        self.C = K.variable(initCentres)
        self.B = K.variable(initBetas)
        self.trainable_weights = [self.B, self.C]

    def call(self, x, mask=None):
        n = K.shape(x)
        # Tile the centre across the batch, then exp(-||x - C||^2 / B^2)
        z = x - K.squeeze(K.repeat(self.C, n[0]), axis=0)
        z2 = K.square(z)
        zs = K.sum(z2, axis=1, keepdims=False)
        zb = -zs / K.square(self.B)
        ze = K.exp(zb)
        return K.reshape(ze, (-1, self.output_dim))

    def get_output_shape_for(self, input_shape):
        assert input_shape and len(input_shape) == 2
        return (input_shape[0], self.output_dim)
</code>

and later, something like:

<code>
from keras.layers import Input, Dense, Activation, merge
from keras.models import Model

nodeCount = 8

inputs = Input(shape=(2,))

# Build nodeCount single-output RBF layers and concatenate their
# outputs to fake one multi-node RBF layer.
nodeList = []
rbfNodes = []
for i in range(nodeCount):
    rbf = RBFLayer(1, name="RBFL" + str(i))
    rbfNodes.append(rbf)
    rbf = rbf(inputs)
    nodeList.append(rbf)

x = merge(nodeList, mode='concat')

x = Dense(6, name="D1", activation="relu")(x)
x = Dense(6, name="D2", activation="relu")(x)
x = Dense(2, name="LMOut")(x)
predictions = Activation('softmax', name="PredictP")(x)

model = Model(input=inputs, output=predictions)
</code>

Cheers,
Steve.
--
You received this message because you are subscribed to a topic in the Google Groups "Keras-users" group.
To unsubscribe from this topic, visit https://groups.google.com/d/topic/keras-users/3RjvOhcnXg4/unsubscribe.
To unsubscribe from this group and all its topics, send an email to keras-users...@googlegroups.com.
To view this discussion on the web, visit https://groups.google.com/d/msgid/keras-users/daa0e8ad-08cf-405a-8101-0feb59d33c86%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

x.maa...@gmail.com

Jun 15, 2017, 5:37:44 PM
to Keras-users
Dear Steve, 

Did you manage to get a working multi-node RBFLayer in the meantime? If so, please share your knowledge :). Also, I was wondering how I should compile and fit your single-node RBF layer. I wish Keras were more versatile in how different types of networks can be implemented =/.

Kind regards, 
Henry

On Monday, 11 July 2016 at 16:50:51 UTC+2, Steve O'Hagan wrote:

Steve O'Hagan

Jun 17, 2017, 2:02:21 PM
to x.maa...@gmail.com, Keras-users
I have not looked at this for a while but I could not see how to get multiple nodes per layer to work.

I ended up defining a layer that restricted to a single output, and concatenated a number of these to get the multi-node layer behaviour - seems a bit of a kludge though.
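As an aside for later readers: the concatenate-N-single-nodes trick is numerically identical to a single broadcast computation over all centres, which is the natural route to a true multi-node layer. A plain-NumPy sketch (hypothetical shapes and beta, matching the exp(-||x - c||² / β²) form used in the code above):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.random((4, 2))        # batch of 4 samples, input dim 2
centres = rng.random((3, 2))  # 3 hypothetical RBF "nodes"
beta = 1.5

# Loop-and-concatenate, like N single-output RBFLayers + merge(mode='concat')
per_node = [np.exp(-np.sum((x - c) ** 2, axis=1) / beta ** 2)
            for c in centres]
concat = np.stack(per_node, axis=1)   # shape (4, 3)

# The same values from one broadcast over all centres at once
vectorised = np.exp(-np.sum((x[:, None, :] - centres) ** 2, axis=2)
                    / beta ** 2)
# concat and vectorised agree element-for-element
```

Inside a Keras layer, the vectorised form would let the centres live in one (units, input_dim) weight tensor instead of one layer per node.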

I'd need to update the code to work with the latest version of Keras since it doesn't work as it is.

I'm sure there must be a much better way of coding it though.

Cheers,
Steve.


-- 
====================
Dr. Steve O'Hagan,
Computer Officer,
Bioanalytical Sciences Group,
School of Chemistry,
Manchester Institute of Biotechnology,
University of Manchester,
131, Princess St, MANCHESTER M1 7DN.

Email: SOH...@manchester.ac.uk
Phone: 0161 306 4562 

zach....@gmail.com

Mar 28, 2019, 2:04:58 PM
to Keras-users
I'm jumping into this discussion almost 3 years after the fact, but this Stack Overflow answer helped me a lot: https://stackoverflow.com/a/53867101

from keras.layers import Layer
from keras import backend as K

class RBFLayer(Layer):
    def __init__(self, units, gamma, **kwargs):
        super(RBFLayer, self).__init__(**kwargs)
        self.units = units
        self.gamma = K.cast_to_floatx(gamma)

    def build(self, input_shape):
        # One centre (mu) per unit, trained with the rest of the model
        self.mu = self.add_weight(name='mu',
                                  shape=(int(input_shape[1]), self.units),
                                  initializer='uniform',
                                  trainable=True)
        super(RBFLayer, self).build(input_shape)

    def call(self, inputs):
        # (batch, input_dim, 1) - (input_dim, units) -> (batch, input_dim, units)
        diff = K.expand_dims(inputs) - self.mu
        l2 = K.sum(K.pow(diff, 2), axis=1)  # squared distance per unit
        res = K.exp(-1 * self.gamma * l2)
        return res

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.units)
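For anyone (like Henry above) wondering what the `call` in this layer actually computes, here is a NumPy re-implementation with hypothetical inputs, useful for checking shapes and values before wiring the layer into a compiled model:

```python
import numpy as np

def rbf_call(inputs, mu, gamma):
    """NumPy equivalent of the layer's call(): exp(-gamma * ||x - mu_j||^2).

    inputs: (batch, input_dim); mu: (input_dim, units) -> (batch, units)
    """
    diff = inputs[..., None] - mu       # (batch, input_dim, units)
    l2 = np.sum(diff ** 2, axis=1)      # (batch, units)
    return np.exp(-gamma * l2)

inputs = np.array([[1.0, 2.0]])
mu = np.array([[1.0, 0.0],
               [2.0, 0.0]])            # unit 0 centred exactly on the input
out = rbf_call(inputs, mu, gamma=0.5)
# out[0, 0] == 1.0; out[0, 1] == exp(-0.5 * 5)
```

The layer itself can then be used like any other Keras layer, e.g. `RBFLayer(10, gamma=0.5)` stacked before a `Dense` output and trained with the usual `model.compile(...)` / `model.fit(...)`.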
