Utility modeling - error during MCMC


Surbhi Gupta

Feb 21, 2023, 7:05:46 PM
to TensorFlow Probability
Can somebody please help me understand the reason for the error I am getting with the code below:

import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions


@tf.function
def utilities(x, betas, errors):
    """
    `transpose(x) * betas + errors` with broadcasting.
    """
    x = tf.cast(x, dtype=tf.float32)
    return tf.transpose(x) * betas + errors

k = 10
sigma_beta = 5.
sigma_error = 1.


def alt_pooled_model(X_train):
    return tfd.JointDistributionSequential([
        tfd.HalfCauchy(loc=0., scale=sigma_beta, name="sigma_beta"),
        tfd.HalfCauchy(loc=0., scale=sigma_error, name="sigma_error"),
        tfd.Normal(loc=tf.zeros(k), scale=sigma_beta, name="beta"),
        tfd.Gumbel(loc=0., scale=sigma_error, name="error"),
        lambda beta, error: tfd.Deterministic(
                tf.math.argmax(
                    tfd.Multinomial(
                        total_count=1,
                        logits=utilities(X_train, beta[..., tf.newaxis], error[..., tf.newaxis]),
                        #name="MNL"
                    ).sample(), axis=-1
                ),
                name="best_choices"
            ),
    ])


def target_log_prob(sigma_beta, sigma_error, beta, error):
    return alt_pooled_model(X_train).log_prob(sigma_beta=sigma_beta, sigma_error=sigma_error, beta=beta, error=error,
                        best_choices=best_choices)

# Use NUTS for inference
hmc = tfp.mcmc.NoUTurnSampler(
    target_log_prob_fn=target_log_prob,
    step_size=.01)

# Unconstrain the scale parameters, which must be positive
hmc = tfp.mcmc.TransformedTransitionKernel(
    inner_kernel=hmc,
    bijector=[
        tfp.bijectors.Identity(),  # sigma_beta
        tfp.bijectors.Identity(),  # sigma_error
        tfp.bijectors.Identity(),  # beta
        tfp.bijectors.Identity(),  # error
    ])

# Adapt the step size for 100 steps before burnin and main sampling
hmc = tfp.mcmc.DualAveragingStepSizeAdaptation(
    inner_kernel=hmc,
    num_adaptation_steps=100,
    target_accept_prob=.75)

# Initialize 10 chains using samples from the prior
joint_sample = alt_pooled_model(X_train).sample(10)
initial_state = [
    joint_sample[0],
    joint_sample[1],
    joint_sample[2],
    joint_sample[3],
]

# Compile with tf.function and XLA for improved runtime performance
@tf.function(autograph=False, experimental_compile=True)
def run():
    return tfp.mcmc.sample_chain(
      num_results=500,
      current_state=initial_state,
      kernel=hmc,
      num_burnin_steps=200,
      trace_fn=lambda _, kr: kr)

samples, traces = run()
print('R-hat diagnostics: ', tfp.mcmc.potential_scale_reduction(samples))

ValueError: Inconsistent names: component with name "beta" was referred to by a different name "error".


Please help

Surbhi Gupta

Feb 21, 2023, 7:10:56 PM
to TensorFlow Probability, Surbhi Gupta
The shape of X_train is (60000, 10) and best_choices is (60000, 1).

rif

Feb 21, 2023, 7:12:08 PM
to Surbhi Gupta, TensorFlow Probability
I think the issue is that when you use a lambda in `JointDistributionSequential`, the arguments to the lambda are bound most recent first (see the example in the `JointDistributionSequential` docstring, particularly the line with the two-parameter lambda computing m from n and g). So I think you need to flip the order of the 'beta' and 'error' arguments to your lambda.
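
For concreteness, here is a minimal self-contained toy (not your model; the `toy` and `y` names are just placeholders for this sketch) showing the most-recent-first binding:

import tensorflow_probability as tfp

tfd = tfp.distributions

# In a JointDistributionSequential, a lambda's first argument is bound to the
# component listed immediately before it, the second to the one before that,
# and so on (most recent first).
toy = tfd.JointDistributionSequential([
    tfd.Normal(loc=0., scale=1., name="beta"),    # listed first
    tfd.Gumbel(loc=0., scale=1., name="error"),   # listed second (most recent)
    # "error" must come first, "beta" second; writing `lambda beta, error:`
    # here is what triggers the "Inconsistent names" ValueError.
    lambda error, beta: tfd.Normal(loc=beta + error, scale=1., name="y"),
])

# Name-based log_prob now resolves consistently.
print(toy.log_prob(beta=0.1, error=0.2, y=0.3))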


Pavel Sountsov

Feb 21, 2023, 7:12:27 PM
to Surbhi Gupta, TensorFlow Probability
The arguments to the lambda that constructs the 'best_choices' distribution should be reversed.


Surbhi Gupta

Feb 21, 2023, 7:41:54 PM
to TensorFlow Probability, si...@google.com, Surbhi Gupta

Thanks a ton, that solves this error, but now I am running into a dimensions error; debugging.

Surbhi Gupta

Feb 21, 2023, 7:58:58 PM
to TensorFlow Probability, Surbhi Gupta
I am running into this error now:

ValueError: Dimensions must be equal, but are 60000 and 10 for '{{node mcmc_sample_chain/dual_averaging_step_size_adaptation___init__/_bootstrap_results/transformed_kernel_bootstrap_results/NoUTurnSampler/.bootstrap_results/process_args/maybe_call_fn_and_grads/value_and_gradients/value_and_gradient/JointDistributionSequential/log_prob/best_choices/log_prob/sub}} = Sub[T=DT_INT64](mcmc_sample_chain/dual_averaging_step_size_adaptation___init__/_bootstrap_results/transformed_kernel_bootstrap_results/NoUTurnSampler/.bootstrap_results/process_args/maybe_call_fn_and_grads/value_and_gradients/value_and_gradient/Const_4, mcmc_sample_chain/dual_averaging_step_size_adaptation___init__/_bootstrap_results/transformed_kernel_bootstrap_results/NoUTurnSampler/.bootstrap_results/process_args/maybe_call_fn_and_grads/value_and_gradients/value_and_gradient/JointDistributionSequential/log_prob/ArgMax)' with input shapes: [60000], [10,10].

I have tried changing the argmax axis to 0 and 1 as well, but I still get a dimension error.
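
To narrow it down, a rough first check (just a debugging sketch, not a fix) is to compare the shape the model assigns to best_choices with the shape of the observed tensor, using alt_pooled_model, X_train, and best_choices as defined above:

# Rough shape check: the failing op is a subtraction inside
# best_choices/log_prob, so the model's best_choices sample and the
# observed best_choices need compatible shapes.
model = alt_pooled_model(X_train)
*_, model_best_choices = model.sample()
print("model best_choices shape:   ", model_best_choices.shape)
print("observed best_choices shape:", best_choices.shape)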
