Hi Chris,
Thanks so much. To simplify things, I'm starting with a more basic network regression. I tried the probit data augmentation approach, using your suggested construction of (Y, Z), on a simple network probit regression. NIMBLE now detects conjugacy (although not for z, which should have a truncated normal full conditional), and this gives a marginal improvement over the RW sampler.
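For concreteness (in case I've misunderstood it), the construction I'm using is the standard probit augmentation, with y the observed binary adjacency entry and z the latent Gaussian variable:

```latex
% Probit data augmentation: the observed edge indicator is a
% thresholded latent Gaussian with unit variance.
y_{ij} = \mathbf{1}\{\, z_{ij} > 0 \,\}, \qquad
z_{ij} \mid \beta \sim \mathcal{N}(\eta_{ij},\, 1),
% so the full conditional of z_{ij} given y_{ij} and \beta is
% \mathcal{N}(\eta_{ij}, 1) truncated to (0, \infty) if y_{ij} = 1
% and to (-\infty, 0] if y_{ij} = 0.
```

where \eta_{ij} is the linear predictor (beta1*x[i,j] plus the vertex random effects in the model below).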
However, when I add random effects, the model fits fine with the RW sampler, but with the conjugate sampler I get the following warning:
warning: logProb of data node y[26, 14]: logProb is -Inf.
Do you have any suggestions about what could be causing the issue? My code is below, and I'm happy to share the data via email if there's no obvious error.
## define the model
glmmCode2 <- nimbleCode({
  # beta0 ~ dnorm(0, sd = 1)  # Drop intercept
  beta1 ~ dnorm(0, sd = 1)
  sigma_RE ~ dunif(0, 10)
  # Model
  for(i in 1:N){ # Random effects for each vertex
    beta2[i] ~ dnorm(0, sd = sigma_RE)
  }
  for(i in 2:N){ # Likelihood over the lower triangle of the adjacency matrix
    for(j in 1:(i-1)){
      y[i,j] ~ dinterval(z[i,j], 0)
      z[i,j] ~ dnorm(beta1*x[i,j] + beta2[i] + beta2[j], 1)
    }
  }
})
## constants, data, and initial values
glmmConsts2 <- list(N = nC)
glmmData2 <- list(
  y = A,
  x = outer(reg, reg, FUN = "==")*1  # x_ij = 1(region_i == region_j)
)
glmmInits2 <- list(beta1 = 0, beta2 = rep(0, nC), sigma_RE = 1)
glmmModel2 <- nimbleModel(code = glmmCode2, constants = glmmConsts2,
                          data = glmmData2, inits = glmmInits2)
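In case it helps the diagnosis, the -Inf can be inspected directly on the built (uncompiled) model object; a sketch using the standard model-querying methods:

```r
## Inspect log probabilities on the uncompiled model
glmmModel2$calculate()              # total log probability; -Inf if any node is -Inf
glmmModel2$getLogProb("y[26, 14]")  # log probability of the flagged data node
glmmModel2$initializeInfo()         # reports nodes without initial values (e.g. z)
```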
configureMCMC(glmmModel2, print = TRUE)
niter <- 2000
nchains <- 2
mcmc.glmm.out2 <- nimbleMCMC(code = glmmCode2, constants = glmmConsts2,
                             data = glmmData2, inits = glmmInits2,
                             nchains = nchains, niter = niter,
                             summary = TRUE, WAIC = TRUE,
                             monitors = c('beta1', 'beta2'))
Best,
Jennifer