# Specify this Stan model in TFP


### Anil Kumar

May 11, 2020, 11:22:21 PM
to TensorFlow Probability
Here N is 1000.
rv1 is a 1000-element vector drawn from a Normal distribution with mean = 100.0 and sd = 10.0.
rv2 is a 1000-element vector drawn from a Normal distribution with mean = 200.0 and sd = 5.0.

Specifically, I want to estimate the hyperparameters (mean and sd) of each distribution given the random variates rv1 and rv2.

```
functions {
}
data {
  int N;
  row_vector[N] rv1;
  row_vector[N] rv2;
}
parameters {
  real m1;
  real<lower=0.001> s1;
  real m2;
  real<lower=0.001> s2;
  row_vector[N] samp1;
  row_vector[N] samp2;
}
transformed parameters {
  real m1_t;
  real m2_t;
  real<lower=0.001> s1_t;
  real<lower=0.001> s2_t;
  m1_t = m1 * 1180.0;
  m2_t = m2 * 2180.0;
  s1_t = s1 * 100.0;
  s2_t = s2 * 100.0;
}
model {
  m1 ~ std_normal();
  m2 ~ std_normal();
  s1 ~ std_normal();
  s2 ~ std_normal();
  rv1 ~ normal(m1_t, s1_t);
  rv2 ~ normal(m2_t, s2_t);
}
```
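On the data scale, the `transformed parameters` block gives these quantities roughly the following priors (second argument is a standard deviation, as in Stan; $\mathcal{N}^{+}$ denotes a normal truncated to positive values, since `s1` and `s2` are bounded below at 0.001):

$$
m_{1t} \sim \mathcal{N}(0,\ 1180), \qquad
m_{2t} \sim \mathcal{N}(0,\ 2180), \qquad
s_{1t},\ s_{2t} \sim \mathcal{N}^{+}(0,\ 100).
$$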

### Christopher Suter

May 12, 2020, 6:11:45 PM
to Anil Kumar, TensorFlow Probability
Hi Anil, here's some code that expresses this model in TFP, along with inference. I think the step sizes need more tuning because I'm seeing pretty bad r-hat values, but this should be enough to get you up and running:

```
import tensorflow as tf
import tensorflow_probability as tfp
tfd = tfp.distributions
N = 1000

# Simulate the data described above: rv1 ~ Normal(100, 10), rv2 ~ Normal(200, 5)
actual_rv1 = tfd.Normal(100., 10.).sample(N)
actual_rv2 = tfd.Normal(200., 5.).sample(N)

joint = tfd.JointDistributionNamed(dict(
    m1 = tfd.Normal(0., 1.),
    m2 = tfd.Normal(0., 1.),
    s1 = tfd.LogNormal(0., 1.),
    s2 = tfd.LogNormal(0., 1.),

    rv1 = lambda m1, s1: tfd.Sample(tfd.Normal(1180. * m1, 100. * s1),
                                    sample_shape=[N]),
    rv2 = lambda m2, s2: tfd.Sample(tfd.Normal(2180. * m2, 100. * s2),
                                    sample_shape=[N]),
))

def target_log_prob(m1, m2, s1, s2):
  return joint.log_prob(m1=m1, m2=m2, s1=s1, s2=s2,
                        rv1=actual_rv1, rv2=actual_rv2)

# Use NUTS for inference
hmc = tfp.mcmc.NoUTurnSampler(
    target_log_prob_fn=target_log_prob,
    step_size=.01)

# Unconstrain the scale parameters, which must be positive
hmc = tfp.mcmc.TransformedTransitionKernel(
    inner_kernel=hmc,
    bijector=[
        tfp.bijectors.Identity(),  # m1
        tfp.bijectors.Identity(),  # m2
        tfp.bijectors.Softplus(),  # s1
        tfp.bijectors.Softplus(),  # s2
    ])

# Adapt the step size for 100 steps before burn-in and main sampling
hmc = tfp.mcmc.DualAveragingStepSizeAdaptation(
    inner_kernel=hmc,
    num_adaptation_steps=100,
    target_accept_prob=.75)

# Initialize 10 chains using samples from the prior
joint_sample = joint.sample(10)
initial_state = [
    joint_sample['m1'],
    joint_sample['m2'],
    joint_sample['s1'],
    joint_sample['s2'],
]

# Compile with tf.function and XLA for improved runtime performance
@tf.function(autograph=False, experimental_compile=True)
def run():
  return tfp.mcmc.sample_chain(
      num_results=500,
      current_state=initial_state,
      kernel=hmc,
      num_burnin_steps=200,
      trace_fn=lambda _, kr: kr)

samples, traces = run()
print('R-hat diagnostics: ', tfp.mcmc.potential_scale_reduction(samples))
```
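
Not part of Christopher's reply, but here is a small sketch of how the output of `run()` could be summarized, assuming `samples` comes back as a list ordered like `initial_state`:

```
# Map the draws back to the data scale used by the model
# (1180 * m1, 2180 * m2, 100 * s1, 100 * s2) and report posterior means.
m1_draws, m2_draws, s1_draws, s2_draws = samples
print('mean 1 estimate:', tf.reduce_mean(1180. * m1_draws).numpy())
print('mean 2 estimate:', tf.reduce_mean(2180. * m2_draws).numpy())
print('sd 1 estimate:  ', tf.reduce_mean(100. * s1_draws).numpy())
print('sd 2 estimate:  ', tf.reduce_mean(100. * s2_draws).numpy())

# Effective sample size is a useful companion diagnostic to r-hat.
print('ESS:', tfp.mcmc.effective_sample_size(samples))
```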


### Anil Kumar

May 13, 2020, 12:20:13 PM
to TensorFlow Probability, anil...@gmail.com
Dear Christopher,

I tuned HMC (rather than NUTS) a bit and am getting acceptable results.
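
The tuned sampler isn't shown in the thread; a rough sketch of what swapping NUTS for plain HMC in Christopher's code might look like (the step size, leapfrog steps, and adaptation settings below are placeholders, not Anil's actual values):

```
# Hypothetical HMC variant of the kernel above; all numeric settings are
# placeholders that would need tuning for good r-hat / acceptance rates.
hmc = tfp.mcmc.HamiltonianMonteCarlo(
    target_log_prob_fn=target_log_prob,
    step_size=.01,
    num_leapfrog_steps=8)
hmc = tfp.mcmc.TransformedTransitionKernel(
    inner_kernel=hmc,
    bijector=[
        tfp.bijectors.Identity(),  # m1
        tfp.bijectors.Identity(),  # m2
        tfp.bijectors.Softplus(),  # s1
        tfp.bijectors.Softplus(),  # s2
    ])
hmc = tfp.mcmc.SimpleStepSizeAdaptation(
    inner_kernel=hmc,
    num_adaptation_steps=150,
    target_accept_prob=.75)
# The same sample_chain call in run() above can then be reused unchanged.
```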

### Christopher Suter

May 13, 2020, 12:21:50 PM
to Anil Kumar, TensorFlow Probability
Great, cheers!
