Custom Layer - loop over input samples


Steve O'Hagan

Jul 28, 2016, 11:44:05 AM
to Keras-users
Hi,

In designing a custom layer, in the call(x) method, is it possible to apply a Keras function over all input (batch) samples?

Also, I'm not clear on the purpose of K.placeholder.

Cheers,
Steve.

jcavali...@gmail.com

Jul 30, 2016, 9:36:15 AM
to Keras-users
K.placeholder creates a symbolic tensor in either Theano or TensorFlow; it has no value initially, until the expression is compiled (to C code, in Theano's case) and a value is passed in.

I think you could get access to the entire training set in a layer. Assume you have a function with the following signature:

    Xtrain, ytrain = build_training_data()

Now you can define the call method like so:

    def call(self, x, mask=None):
        # load the full training set and wrap it in a backend variable
        Xtrain, ytrain = build_training_data()
        Xtensor = K.variable(Xtrain)
        # now the full training set is available as Xtensor
        output = F(x, Xtensor)
        return output

where F() is some function of the input to the call method, x, and the full training set, Xtensor. You need to use K.variable here so you can instantiate a symbolic tensor with an actual value, Xtrain.

I think this should work.

Stephen O'hagan

Aug 1, 2016, 6:45:14 AM
to jcavali...@gmail.com, Keras-users

Hi,

Thanks for this; not quite what I need, but useful info.

I assume that in call(self, x), x will contain the current batch of inputs – I'm fine with that.

I have a function that will process a row of x – i.e. its input is a vector; I haven't been able to cast this into a form that will take the full matrix/tensor, x.

In raw Theano, I believe there's a "scan" function which iterates over the rows of x and builds the rows of the output tensor, but this is not exposed by the Keras backend. I assume it is possible to use T.scan if I give up TensorFlow compatibility.

It's quite possible that there's a Keras way of doing this that I have missed.
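[As an aside, the row-by-row semantics that scan provides can be pinned down in plain NumPy; this is only an illustration of the intended behaviour, not backend code — theano.scan operates on symbolic tensors, and per_row here is a hypothetical stand-in for the per-sample function:]

```python
import numpy as np

def per_row(v):
    # hypothetical per-sample function: takes a vector, returns a vector
    return v * v

X = np.arange(6.0).reshape(3, 2)

# scan-style loop: apply per_row to each row of X and stack the results
out = np.stack([per_row(row) for row in X])

assert out.shape == X.shape
assert np.allclose(out, X * X)
```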

Cheers,

Steve.

--
You received this message because you are subscribed to a topic in the Google Groups "Keras-users" group.
To unsubscribe from this topic, visit https://groups.google.com/d/topic/keras-users/o4VJ7-2sqoA/unsubscribe.
To unsubscribe from this group and all its topics, send an email to keras-users...@googlegroups.com.
To view this discussion on the web, visit https://groups.google.com/d/msgid/keras-users/8a2b36ba-a710-465f-9da7-c42753ccc37b%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

jcavali...@gmail.com

Aug 2, 2016, 10:59:28 AM
to Keras-users
Yes, if you switch backends you can just import theano and get access to the scan method, but you may not need it. Could you clarify something for me?

This is the call method for a Dense layer. After dotting the input x with the weight W, output contains a matrix with B rows and K columns, for batch size B and K nodes in the current layer, so that after applying the activation function each row contains the K activation values for that particular sample within the batch.

So, as you can see, the standard activation functions (relu, linear, sigmoid, tanh) all take the full B x K matrix, but if B = 1 they would accept just a vector. Could you clarify what your function needs to do that is different from this process?

def call(self, x, mask=None):
    output = K.dot(x, self.W)
    if self.bias:
        output += self.b
    return self.activation(output)
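[To illustrate the point about shapes, here is a plain-NumPy sketch — relu and the sizes are just stand-ins — showing that applying the activation to the full B x K matrix gives the same result, row for row, as applying it to a single sample's vector:]

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)  # elementwise, so any shape works

B, D, K_nodes = 4, 3, 5  # batch size, input dim, nodes in the layer
rng = np.random.default_rng(0)
x = rng.standard_normal((B, D))
W = rng.standard_normal((D, K_nodes))
b = np.zeros(K_nodes)

batch_out = relu(x @ W + b)   # whole batch at once: shape (B, K_nodes)
row_out = relu(x[0] @ W + b)  # single sample (the B = 1 case): shape (K_nodes,)

assert batch_out.shape == (B, K_nodes)
assert np.allclose(batch_out[0], row_out)
```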


Stephen O'hagan

Aug 2, 2016, 11:58:21 AM
to jcavali...@gmail.com, Keras-users

Hi,

Assume X is shaped (samples, 2); in general the second dimension could be some number other than 2.

I'm trying to make a custom layer that has N nodes; each node has a centre vector, Vc, of dimension 2, and a scalar factor, S, with some transformation function, f. Each node has its own Vc and S, both set as learnable. I suppose each node within the layer has its own activation function.

Each node will generate a scalar output, Q = f(S * ||x - Vc||^2); thus, for all N nodes, where x represents a row of X, the layer will generate a vector of length N: [Q0, Q1, ..., Qn].

I have not been able to work out how to get this in tensor form over all samples such that Qout is shaped (samples, N) – what makes this hard (for me to get my head round!) is that it needs to be in terms of the symbolic tensor rather than a particular concrete array.

I'm hoping that I'd be able to build the tensor for Qout one row at a time via theano.scan (since I know how to do one row!)
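[For what it's worth, the per-row formula above can also be written over the whole batch with broadcasting, no scan needed. A NumPy sketch, where f, S, and Vc are stand-ins for the learnable quantities — in a real layer these would be symbolic backend tensors:]

```python
import numpy as np

B, D, N = 6, 2, 4  # samples, input dim, number of nodes
rng = np.random.default_rng(1)
X = rng.standard_normal((B, D))    # input batch
Vc = rng.standard_normal((N, D))   # one centre vector per node
S = rng.uniform(0.5, 2.0, size=N)  # one scalar factor per node

def f(z):
    # hypothetical transformation function, e.g. a Gaussian bump
    return np.exp(-z)

# (B, 1, D) - (1, N, D) broadcasts to (B, N, D); sum squares over D -> (B, N)
sq_dist = np.sum((X[:, None, :] - Vc[None, :, :]) ** 2, axis=-1)
Qout = f(S[None, :] * sq_dist)  # shape (samples, N), as required

# check one row against the single-sample formula Q = f(S * ||x - Vc||^2)
row0 = f(S * np.sum((X[0] - Vc) ** 2, axis=-1))
assert Qout.shape == (B, N)
assert np.allclose(Qout[0], row0)
```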

Cheers,

Steve.



swapnil....@gmail.com

Dec 28, 2017, 2:14:18 PM
to Keras-users
Hi

Were you able to solve this issue using the TensorFlow / Keras backend?
