def call(self, x):
    # build the full training set and wrap it in backend tensors
    Xtrain, ytrain = build_training_data()
    Xtensor = K.variable(Xtrain)
    ytensor = K.variable(ytrain)
    # now you can use the full training set in Xtensor, ytensor
    output = F(x, Xtensor)  # F: whatever function combines the batch with the training data
    return output

Hi,
Thanks for this; not quite what I need, but useful info.
I assume that in call(self, x), x will contain the current batch of inputs – I’m fine with that.
I have a function that will process a row of x – i.e. its input is a vector; I haven’t been able to cast this into a form that will take the full matrix/tensor, x.
In raw Theano, I believe there’s a “scan” function which iterates over the rows of x and builds the rows of the output tensor, but this is not exposed by the Keras backend. I assume it is possible to use T.scan if I give up TensorFlow compatibility.
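For what it’s worth, the effect of scan over rows can be mimicked in plain numpy, which is a handy way to check the per-row function before moving to symbolic tensors (row_fn here is a made-up placeholder, not anything from the Keras API):

import numpy as np

def row_fn(row):
    # placeholder per-row function: any vector -> vector mapping
    return row * 2.0

X = np.arange(6.0).reshape(3, 2)          # (samples, dim)
# the numpy analogue of scanning row_fn over the rows of X:
out = np.stack([row_fn(row) for row in X])  # one output row per input row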
It’s quite possible that there’s a Keras way of doing this that I have missed.
Cheers,
Steve.
--
You received this message because you are subscribed to a topic in the Google Groups "Keras-users" group.
To unsubscribe from this topic, visit
https://groups.google.com/d/topic/keras-users/o4VJ7-2sqoA/unsubscribe.
To unsubscribe from this group and all its topics, send an email to
keras-users...@googlegroups.com.
To view this discussion on the web, visit
https://groups.google.com/d/msgid/keras-users/8a2b36ba-a710-465f-9da7-c42753ccc37b%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
def call(self, x, mask=None):
    output = K.dot(x, self.W)
    if self.bias:
        output += self.b
    return self.activation(output)

Hi,
Assuming X is shaped (samples, 2) – though in general the second dim could be some number other than 2.
I’m trying to make a custom layer that has ‘N’ nodes; each node has a centre vector, Vc, of dimension 2, and a scalar factor, S, with some transformation function, f.
Each node has its own Vc and S, both set as learnable. I suppose each node within the layer effectively has its own activation function.
Each node will generate a scalar output, Q = f(S*||x-Vc||^2); thus, over all ‘N’ nodes, where x represents a row of X, the layer will generate a vector of length N, [Q0, Q1, … Qn].
I have not been able to work out how to express this in tensor form over all samples such that Qout is shaped (samples, N) – what makes this hard (for me to get my head round!) is that it needs to be in terms of the symbolic tensor rather than just a particular concrete array.
I’m hoping that I’d be able to build the tensor for Qout one row at a time via theano.scan (since I know how to do one row!).
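For reference, this particular computation can be vectorised over all samples at once with broadcasting, with no scan needed. A numpy sketch (tanh stands in for f, and the shapes and random values are made up for illustration); the same broadcasting pattern should carry over to the Keras backend via K.expand_dims and K.sum:

import numpy as np

samples, dim, N = 5, 2, 3
X = np.random.randn(samples, dim)   # input batch
Vc = np.random.randn(N, dim)        # one centre vector per node (learnable)
S = np.random.randn(N)              # one scalar factor per node (learnable)

# broadcast to (samples, N, dim): every sample against every centre
diff = X[:, None, :] - Vc[None, :, :]
sq_dist = np.sum(diff ** 2, axis=-1)  # ||x - Vc||^2, shaped (samples, N)
Q = np.tanh(S * sq_dist)              # f applied elementwise; Qout is (samples, N)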
Cheers,
Steve.
From: keras...@googlegroups.com [mailto:keras...@googlegroups.com]
On Behalf Of jcavali...@gmail.com
Sent: 02 August 2016 15:59
To: Keras-users <keras...@googlegroups.com>
Subject: Re: Custom Layer - loop over input samples
Yes, if you switch backends you can just import theano and get access to the scan method, but you may not need that. If you could clarify something for me:
Were you able to solve this issue using the TensorFlow / Keras backend?