Building Keras if else like function

jonstew...@gmail.com

Jan 18, 2018, 9:34:40 AM
to Keras-users

I am trying to build a layer in Keras that computes a different function for each input, depending on the value of that input. The sketch below gives a general overview of the problem; it throws an error because tensor objects cannot be used in boolean statements:

def conditionalfunction(x):  # here, x is a Keras tensor
    if x < 0.5:
        return tanh(x)
    else:
        return sigmoid(x)
   

That throws the error "Using a tf.Tensor as a Python bool is not allowed". I also tried another approach: build a constant tensor, then use the Keras backend function K.less_equal to compare the input values against it, which should yield a boolean Keras tensor. That seems to throw an error as well:

def conditionalfunction(layer1):
    # Constant threshold with the same dtype as the input. Note K.constant
    # takes a tuple for shape, not a tensor; broadcasting makes an explicit
    # shape unnecessary here.
    threshold1 = K.constant(0.25, dtype=K.dtype(layer1))
    # K.less_equal returns a boolean tensor; cast it to float so it can
    # be multiplied with the input.
    q1 = K.cast(K.less_equal(layer1, threshold1), K.dtype(layer1))
    q2 = Multiply()([q1, layer1])  # zero out values above the threshold
    q3 = K.relu(q2)
    # repeat the above using K.greater for the other region

    return q3

The error thrown for the above code seemed to come from the K.constant line. The idea is to do the same thing by building boolean tensors, multiplying them pointwise with the original tensor, and applying the activation function; once this is done for each region of the if/else, I would pointwise-sum the resulting tensors. My problem could simply be a programming error, as I am a bit new to Keras/TensorFlow, or I may not understand how the Keras backend functions are meant to be used. Any pointers or code examples for achieving the above are very much appreciated, and if any further clarification is needed, let me know. Also, I realize the code above is unlikely to be performant; it's meant just as an example of what I am trying to do and the problem I am experiencing. Thanks!
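For anyone landing here, a numpy stand-in for the masking arithmetic described above (numpy rather than the Keras backend so it is easy to check by hand; in the backend the key step is casting the boolean from K.less_equal to a float with K.cast before multiplying — the function name here is illustrative, not from the original post):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def conditional_activation(x, threshold=0.5):
    # Float mask for each region; the cast-to-float is the step the
    # K.constant / Multiply attempt above was missing.
    below = (x <= threshold).astype(x.dtype)
    above = 1.0 - below
    # Apply each activation, zero out the other region, then sum.
    return below * np.tanh(x) + above * sigmoid(x)

x = np.array([-1.0, 0.0, 1.0])
y = conditional_activation(x)
```

Elements at or below the threshold pass through tanh, the rest through sigmoid, matching the original if/else.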

camero...@gmail.com

Sep 15, 2018, 2:30:49 PM
to Keras-users
Running into the same issue... any luck resolving this?

Sergey O.

Sep 15, 2018, 3:09:05 PM
to camero...@gmail.com, Keras-users

It might be better to use some kind of smooth switch function (such as a sigmoid) rather than a hard condition, to give a smoother gradient (better for the optimizer). For example:
def cond_switch(x):
    threshold = 0.5
    slope = 1.0
    # Smooth 0..1 gate: ~0 below the threshold, ~1 above it.
    switch = K.sigmoid((x - threshold) * slope)
    return switch * K.relu(x) + (1.0 - switch) * K.tanh(x)
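To see why the slope matters, here is a quick numpy check of the same blend (values are illustrative): with a steep slope the soft blend approaches the hard if/else from the first post, while slope = 1.0 keeps the transition smooth and differentiable.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cond_switch(x, threshold=0.5, slope=1.0):
    # Smooth 0..1 gate centred on the threshold.
    switch = sigmoid((x - threshold) * slope)
    return switch * np.maximum(x, 0.0) + (1.0 - switch) * np.tanh(x)

x = np.array([-2.0, 0.5, 2.0])
smooth = cond_switch(x, slope=1.0)   # gentle blend around the threshold
hard = cond_switch(x, slope=100.0)   # steep slope approximates the hard if/else
```

With slope=100, the output is essentially tanh(x) below the threshold and relu(x) above it.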


--
You received this message because you are subscribed to the Google Groups "Keras-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to keras-users+unsubscribe@googlegroups.com.
To view this discussion on the web, visit https://groups.google.com/d/msgid/keras-users/f76e8bbe-bb99-443f-9ff5-2ef08017b050%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Ted Yu

Sep 15, 2018, 4:08:04 PM
to Sergey Ovchinnikov, camero...@gmail.com, keras...@googlegroups.com
For the last line, shouldn't the relu be written as K.relu(x - 0.5), considering the original conditionalfunction?

Thanks


Sergey O.

Sep 15, 2018, 4:42:02 PM
to Ted Yu, camero...@gmail.com, Keras-users
That was just an example; you can use any functions (with any bias) that you want.

But if you are actually trying to implement a tanh-to-relu transition, I think that's effectively what "elu" does.


jonstew...@gmail.com

Sep 17, 2018, 11:39:16 AM
to Keras-users
Apologies for the delay in responding. I ended up using something similar to sokrypton's answer. I couldn't think of or find a way to get an actual boolean result out of it, but thinking about it more, that wasn't actually a requirement for what I wanted. The other thing I tried was explicitly building two different paths, each gated by a threshold function applied to the tensor (a sigmoid activation followed by a ReLU, with x - threshold as input for one path and 1 - x - threshold for the other), which drives everything above or below the threshold, depending on the path, to 0 while leaving the rest at x. Those graph paths can then be treated separately, with different activation functions or other differences, and recombined later with a sum or other merge function in the DAG structure. Both approaches worked okay and were interesting to play around with. Thanks for the help!
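A rough numpy sketch of the two-path idea described above (the gating details here are my reading of the description, not the poster's exact code): a steep sigmoid gate drives the "wrong" region of each path toward zero, each path applies its own activation, and the paths are summed.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def relu(x):
    return np.maximum(x, 0.0)

def two_path(x, threshold=0.5, sharpness=50.0):
    # Soft gate that is ~1 above the threshold and ~0 below it.
    gate_above = sigmoid((x - threshold) * sharpness)
    gate_below = 1.0 - gate_above
    # Each path sees the input with the other region driven toward 0,
    # applies its own activation, and the results are summed.
    path_a = relu(gate_above * x)     # e.g. relu branch for x > threshold
    path_b = np.tanh(gate_below * x)  # e.g. tanh branch for x <= threshold
    return path_a + path_b
```

In a Keras model, each path would be its own branch of the DAG (Lambda/activation layers), merged at the end with an Add layer.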