Errors in Bijector power


Siva Rajesh

Mar 7, 2021, 11:51:00 AM
to TensorFlow Probability
I am exploring the bijector API Power. I have the following example in mind: say X ~ N(0, 1), i.e. X is sampled from a standard normal distribution. What are the distributional properties of X^3?

Here is a minimal reproducible example.

################################
import tensorflow as tf
import tensorflow_probability as tfp
import numpy as np
tfd = tfp.distributions
tfb = tfp.bijectors

n = tfd.Normal(loc=0., scale=1.)
n_transformed = tfd.TransformedDistribution(n, bijector=tfb.Power(3))

print(n_transformed.quantile(0.10)) ## output -2.1047876 
print(n_transformed.quantile(0.90)) ## 2.1047876

print(n_transformed.cdf(-2.1047876)) ## Gives NaN as the output
print(n_transformed.cdf(2.104786)) ## 0.89999

print(n_transformed.prob(-2)) ## gives NaN
print(n_transformed.prob(2)) ## 0.037879337

#################################


Ideally, if the quantiles are computed correctly, we would expect the corresponding inverse, i.e. the CDF, to be computed correctly as well; however, specifically for negative inputs it returns NaN. Similarly, the density is computed correctly for positive inputs but returns NaN for negative inputs.

Exploring further, I think this is due to the tf.math.pow function that is used internally. Specifically, tf.math.pow returns NaN for any negative number raised to a fractional power; for example, tf.math.pow(-8.0, 1.0/3.0) gives NaN.
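The values the example expects can be checked without TF/TFP at all. Since x -> x^3 is strictly increasing, cdf_Y(y) = Phi(cbrt(y)) and pdf_Y(y) = phi(cbrt(y)) / (3 |y|^(2/3)), where cbrt is a sign-aware real cube root (exactly what tf.math.pow lacks for negative bases). A stdlib-only sketch (the helper names `cbrt`, `cube_cdf`, `cube_pdf` are illustrative, not TFP API):

```python
import math

def cbrt(y):
    # Sign-aware real cube root: cbrt(-8) == -2, unlike (-8.0) ** (1/3),
    # which is NaN (or a complex principal branch) in float arithmetic.
    return math.copysign(abs(y) ** (1.0 / 3.0), y)

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_pdf(x):
    # Standard normal density.
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def cube_cdf(y):
    # CDF of Y = X^3 for X ~ N(0, 1): monotone change of variables.
    return norm_cdf(cbrt(y))

def cube_pdf(y):
    # Density of Y = X^3: change-of-variables Jacobian is (1/3) |y|^(-2/3).
    return norm_pdf(cbrt(y)) / (3.0 * abs(y) ** (2.0 / 3.0))

print(cube_cdf(-2.1047876))  # ~0.10, finite where TransformedDistribution gave NaN
print(cube_pdf(2.0))         # ~0.0379, matching the value TFP returns for +2
```

By symmetry of the standard normal, `cube_cdf(-2.1047876)` should be 0.10 and `cube_pdf(-2.0)` should equal `cube_pdf(2.0)`, so the NaNs above are purely an artifact of the power computation, not of the mathematics.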

I am not sure where to report this issue. Kindly let me know whether there is a way to circumvent it so that my reproducible example works.
Thanks in advance. 



Eugene Brevdo

Mar 7, 2021, 5:30:05 PM
to Siva Rajesh, TensorFlow Probability
tf.math.pow will give a correct answer if you feed in a complex number:

tf.math.pow(-8.0+0j, 1.0/3.0)
<tf.Tensor: shape=(), dtype=complex128, numpy=(1+1.732050807568877j)>

(And obviously TF's Pow op is picking a complex branch; I'm guessing it's the one for theta = pi.) Either way, the issue is probably deeper than just upcasting in a few places.
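The same principal-branch behavior can be reproduced with Python's built-in complex power, no TensorFlow needed: the principal logarithm of -8 is ln(8) + i*pi, so (-8)^(1/3) comes out as 2*exp(i*pi/3) = 1 + sqrt(3)*i rather than the real root -2. A short sketch:

```python
import cmath
import math

# Principal-branch complex power: (-8)^(1/3) = exp((1/3) * Log(-8)),
# with Log(-8) = ln(8) + i*pi, so the result is 2 * exp(i*pi/3).
z = (-8.0 + 0j) ** (1.0 / 3.0)
print(z)  # approx (1+1.7320508075688772j), the same branch TF's complex Pow picks

# The same value, built explicitly from the polar form r * exp(i*theta), theta = pi:
r, theta = abs(-8.0), math.pi
w = (r ** (1.0 / 3.0)) * cmath.exp(1j * theta / 3.0)
print(abs(z - w))  # approx 0.0
```

This is why upcasting to complex "fixes" the NaN but does not recover the real odd root a user of tfb.Power(3) would expect for negative inputs.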
