NormalInverseGaussian: ensuring that |tailweight| >= |skewness|


Daniel Weitzenfeld

Mar 22, 2023, 10:02:24 PM3/22/23
to TensorFlow Probability
Hi,
I'm building a density network, and I'm considering using the NormalInverseGaussian distribution. I'm looking for recommendations on how to ensure that the parameters passed to the distribution are valid; specifically, that |tailweight| >= |skewness|.
My first idea is to define the loss so that it takes an arbitrarily large value whenever the constraint is violated. That doesn't feel robust, though: at prediction time, nothing would prevent the model from passing invalid parameters to the distribution.
My model will look something like the model I described here, though if I can figure out how to get the NormalInverseGaussian to work, I may not need a mixture distribution.
Thanks,
Dan  

Christopher Suter

Mar 22, 2023, 11:42:20 PM3/22/23
to Daniel Weitzenfeld, TensorFlow Probability
Don't train tailweight directly. Instead, train an unconstrained "pre_tailweight" and then pass tailweight = |skewness| + tf.math.softplus(pre_tailweight).

Softplus ensures the term we're adding is positive, so tailweight >= |skewness| holds by construction.
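
A minimal sketch of what that might look like in TFP, assuming the network emits a 4-column unconstrained tensor (the make_nig name and the column ordering are illustrative choices, not anything prescribed by TFP):

    import tensorflow as tf
    import tensorflow_probability as tfp

    tfd = tfp.distributions

    def make_nig(params):
        # params: unconstrained network output with shape [..., 4],
        # interpreted as (loc, pre_scale, skewness, pre_tailweight).
        loc, pre_scale, skewness, pre_tailweight = tf.unstack(params, axis=-1)
        scale = tf.math.softplus(pre_scale)  # scale must be positive
        # softplus(.) > 0, so tailweight > |skewness| for any real pre_tailweight.
        tailweight = tf.math.abs(skewness) + tf.math.softplus(pre_tailweight)
        return tfd.NormalInverseGaussian(
            loc=loc, scale=scale, tailweight=tailweight, skewness=skewness)

Because the constraint is built into the parameterization, it holds at prediction time as well as during training, with no loss penalty needed. In a Keras model, a function like this can serve as the final layer via tfp.layers.DistributionLambda(make_nig).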


Daniel Weitzenfeld

Mar 23, 2023, 8:27:07 AM3/23/23
to Christopher Suter, TensorFlow Probability
Thank you, Chris! I had a feeling it was something simple like this.