Perplexing errors in vectorised GLM

Bob O'Hara

Jan 8, 2026, 6:09:16 AM
to nimble-users
I'm trying to implement a weighted Bayesian lasso in Nimble for a binary response (the long version: I want a Bayesian version of MaxEnt so I can do some uncertainty propagation). I'm trying to adapt the final method here: <https://r-nimble.org/examples/linpred.html>, but I'm hitting error messages that I can read but that (to me) don't make sense for this model.

Reproducible code is below. With the weighted function, I'm told "Error in chol.default(model$omega[1:11, 1:11]) : the leading minor of order 1 is not positive". If I set omega to be an identity matrix this doesn't happen, but if I make it a diagonal matrix with dgamma(N,N) (N large) priors I do get the error. The error does not seem to come from the part of the model where the weights are, though.

I tried to make a minimal reproducible example by removing the weights, and got a different error: "bad parameters for distribution dbern_vec(p = prob[1:n]). (No available re-parameterization found.)". This doesn't seem to have anything to do with the weights either: my guess is that, with the weights, omega fails before we reach this error.

I'm sure this all does make sense, but any help would be gratefully received.

Bob

library(disdat)
library(nimble)

dbern_vec_wt <- nimbleFunction( ## Define the weighted distribution
  run = function(x = double(1), p = double(1), wt = double(1), log = integer(0, default = 0)) {
    returnType(double(0))
    logProbs <- wt*dbinom(x=x, size=1, prob=p, log = TRUE)
    logProb <- sum(logProbs)
    if(log) return(logProb)
    else return(exp(logProb))
  })

dbern_vec <- nimbleFunction( ## Define the unweighted distribution
  run = function(x = double(1), p = double(1), log = integer(0, default = 0)) {
    returnType(double(0))
    logProbs <- dbinom(x=x, size=1, prob=p, log = TRUE)
    logProb <- sum(logProbs)
    if(log) return(logProb)
    else return(exp(logProb))
  })

rbern_vec <- nimbleFunction( ## Define a simulation function, optionally.
  run = function(n = integer(0), p = double(1)) {
    returnType(double(1))
    if(n != 1) print("rbern_vec only allows n = 1; using n = 1.")
    smp <- rbinom(n, p, size=1)
    return(smp)
  })

Weightedcode <- nimbleCode({
  for(j in 1:k) {
    ratetau[j] <- MeanLambda
    tau2[j] ~ dexp(rate=ratetau[j])
  }
  omega[1:k, 1:k] <- diag(1/tau2[1:k])
  alpha ~ dnorm(0, sd = 1)
  beta[1:k] ~ dmnorm(zeros[1:k], omega[1:k, 1:k])
  linpred[1:n] <- alpha + X[1:n, 1:k] %*% beta[1:k]
  logit(prob[1:n]) <- linpred[1:n]
  y[1:n] ~ dbern_vec_wt(prob[1:n], W[1:n])
})

Code <- nimbleCode({
  for(j in 1:k) {
    ratetau[j] <- MeanLambda
    tau2[j] ~ dexp(rate=ratetau[j])
  }
  omega[1:k, 1:k] <- diag(1/tau2[1:k])
  alpha ~ dnorm(0, sd = 1)
  beta[1:k] ~ dmnorm(zeros[1:k], omega[1:k, 1:k])
  linpred[1:n] <- alpha + X[1:n, 1:k] %*% beta[1:k]
  logit(prob[1:n]) <- linpred[1:n]
  y[1:n] ~ dbern_vec(prob[1:n])
})


# Use one species from Canadian data
CAN <- disData("CAN")
EnvPO <- c("alt", "asp2", "ontprec", "ontprec4", "ontprecsd", "ontslp",
           "onttemp", "onttempsd", "onttmin4", "ontveg", "watdist")
Species <- "can02"

SpData <- rbind(CAN$po[CAN$po$spid==Species,c("occ", EnvPO)], CAN$bg[,c("occ", EnvPO)])
SpData[,EnvPO] <- apply(SpData[,EnvPO], 2, scale) # scale covariates

mm <- as.matrix(SpData[,EnvPO])
dd <- list(y = SpData$occ)

Params <- c("alpha", "beta")

Const <- list(n = nrow(mm), k = ncol(mm), MeanLambda = 1, zeros = rep(0, ncol(mm)))
Data <- list(y = SpData$occ, X = mm, W=rep(1,nrow(mm)))
Inits <- list(alpha = 0, beta = rnorm(Const$k, 0, sd=0.01))


Model_wt <- nimbleModel(code = Weightedcode, data = Data, constants = Const, inits = Inits)

# Gives the following error:
# Error in chol.default(model$omega[1:11, 1:11]) :
#  the leading minor of order 1 is not positive

Model <- nimbleModel(code = Code, data = Data, constants = Const, inits = Inits)

# Gives the following error:
# Error: bad parameters for distribution dbern_vec(p = prob[1:n]). (No available re-parameterization found.)

Daniel Turek

Jan 8, 2026, 7:29:45 AM
to Bob O'Hara, nimble-users
Bob, good questions.  The error:

# Error in chol.default(model$omega[1:11, 1:11]) :
#  the leading minor of order 1 is not positive


is just a problem with initial values.  Since the dmnorm distribution (under the hood) takes the Cholesky factorization of its precision/covariance argument, it needs a valid initial value for that matrix.  Your omega matrix is determined by the values in tau2, so you need to provide valid (positive) initial values for the tau2 vector; that gives a valid initial value for the omega matrix, and the error goes away.  You can try, for example:

Inits <- list(alpha = 0, beta = rnorm(Const$k, 0, sd=0.01),
              tau2 = rep(1, Const$k))


For your second example (using nimbleModel(code = Code, ...)), I didn't get the same error you did.  In fact, when using the valid initial values (shown above), I didn't get any error at all from this line:

Model <- nimbleModel(code = Code, data = Data, constants = Const, inits = Inits)

Maybe I made a mistake running your code.  Are you able to try again, and maybe send another reproducible example of this error?

Cheers!
Daniel






Bob O'Hara

Jan 9, 2026, 10:08:40 AM
to nimble-users
Wonderful, thanks. 

Restarting R solved the second problem, so that was at my end. 

Bob

Bob O'Hara

Jan 13, 2026, 8:28:52 AM
to nimble-users
Great. Now I've got another perplexing error...

Everything was working, and then I tried to parallelise the code. At some point during that process, I started getting the following error when compiling:

Error in match.call(definition, call, expand.dots, envir) :
  unused argument (wt = model$W[1:10740])

But the only place I have a wt argument is in the dbern_vec() function, and there it is used. The full code is below, but it wouldn't surprise me if the issue is not reproducible. Help?

Bob

library(disdat)
library(nimble)

dbern_vec <- nimbleFunction( ## Uses wt
  run = function(x = double(1), p = double(1), wt = double(1),
                 log = integer(0, default = 0)) {
    returnType(double(0))
    logProb <- sum(wt*dbinom(x=x, size=1, prob=p, log = TRUE)) # wt used here!!!!!

    if(log) return(logProb)
    else return(exp(logProb))
  })

rbern_vec <- nimbleFunction( 
  run = function(n = integer(0), p = double(1)) {
    returnType(double(1))
    if(n != 1) print("rbern_vec only allows n = 1; using n = 1.")
    smp <- rbinom(n, p, size=1)
    return(smp)
  })

MaxNetcode <- nimbleCode({
  for(j in 1:k) {
    ratetau[j] <- MeanLambda*reg[j]
    sigma2[j] ~ dexp(rate=ratetau[j])
  }
  omega[1:k, 1:k] <- diag(1/sigma2[1:k]) # precision

  alpha ~ dnorm(0, sd = 1)
  beta[1:k] ~ dmnorm(zeros[1:k], omega[1:k, 1:k])
  linpred[1:n] <- alpha + X[1:n, 1:k] %*% beta[1:k]
  logit(Prob[1:n]) <- linpred[1:n]
  y[1:n] ~ dbern_vec(p=Prob[1:n], wt=W[1:n]) # wt passed to function here
})



CAN <- disData("CAN")
EnvPO <- c("alt", "asp2", "ontprec", "ontprec4", "ontprecsd", "ontslp",
           "onttemp", "onttempsd", "onttmin4", "ontveg", "watdist")
Species <- "can02"

SpData <- rbind(CAN$po[CAN$po$spid==Species,c("occ", EnvPO)], CAN$bg[,c("occ", EnvPO)])
SpData[,EnvPO] <- apply(SpData[,EnvPO], 2, scale) # scale covariates


mm <- as.matrix(SpData[,EnvPO])
weights <- SpData$occ + (1 - SpData$occ) * 100

Const <- list(n = nrow(mm), k = ncol(mm), MeanLambda = 1,
              zeros=rep(0,ncol(mm)))
Data <- list(y = SpData$occ, X = mm, W = rep(1, nrow(mm)), reg = rep(1, ncol(mm)))
Inits <- list(alpha = rnorm(1, log(1/(1+1/mean(SpData$occ))), 1),
              beta = rnorm(Const$k, 0, sd=0.1), sigma2 = rep(1, ncol(mm)))

Model <- nimbleModel(code = MaxNetcode, data = Data,
                     constants = Const, inits = Inits)

# This line gives the error
CmyModel <- compileNimble(Model, showCompilerOutput = TRUE)

Daniel Turek

Jan 13, 2026, 10:29:30 AM
to Bob O'Hara, nimble-users
Bob, I'm glad to hear the other changes worked out.

As for this issue, it looks like a quick fix is needed in the argument specifications for the run functions of the dbern_vec and rbern_vec custom distribution functions.  Bottom line: even if the rbern_vec function never makes use of the "wt" argument, the arguments of the "d" and "r" functions for any custom distribution must still "agree" with each other.  By "agree" with each other, we mean:

"d" density function arguments are exactly as: x, [CUSTOM ARGS HERE], log

"r" random generation function arguments are exactly as: n, [THE SAME CUSTOM ARGS HERE]

So, in your case, we only need to add the "wt" argument to the "r" function:

dbern_vec <- nimbleFunction( ## Uses wt
    run = function(x = double(1), p = double(1), wt = double(1),
                   log = integer(0, default = 0)) {
        returnType(double(0))
        logProb <- sum(wt*dbinom(x=x, size=1, prob=p, log = TRUE)) # wt used here!!!!!
        if(log) return(logProb)
        else return(exp(logProb))
    })

rbern_vec <- nimbleFunction(
    run = function(n = integer(0), p = double(1), wt = double(1)) {
        returnType(double(1))
        if(n != 1) print("rbern_vec only allows n = 1; using n = 1.")
        smp <- rbinom(n, p, size=1)
        return(smp)
    })
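
As an aside (not required for the fix above, since nimble will attempt to register a custom distribution automatically when it first appears in model code), you can also register the distribution explicitly before building the model. A minimal sketch of what that might look like here:

registerDistributions(list(
    dbern_vec = list(
        BUGSdist = "dbern_vec(p, wt)",                 # how it is called in the model code
        types = c('value = double(1)',                 # the data node y[1:n] is a vector
                  'p = double(1)', 'wt = double(1)'),  # both parameters are vectors
        discrete = TRUE)))                             # Bernoulli outcomes are discrete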


Let me know if this doesn't work, or you have any other questions.

Cheers,
Daniel



Bob O'Hara

Jan 16, 2026, 8:02:43 AM
to nimble-users
Thanks! Now it's working. Well, almost...

I think there are scoping issues that are causing me problems. I have two examples.

For the first, the code is below: when I try to parallelise it, I get the following error:

Error in checkForRemoteErrors(val) :
  2 nodes produced errors; first error: R function 'calc_dmnormAltParams' in the code 'calc_dmnormAltParams(lifted_chol_oPomega_oB1to11_comma_1to11_cB_cP[1:11, 1:11], 1, 0)' does not exist.

I can use clusterExport() to export functions I know, but nimble is evidently creating new functions that I would need to track down, so I'm hoping there's a better approach.

The second example is from me building this into a package (code here: https://github.com/oharar/BayesMaxEnt). It runs fine from the command line, but when I try to build a vignette, I get a similar error (see https://rpubs.com/oharar/AaaaaghMakeItWork, which is from the vignette in the package).

At least I'm making progress.

Bob

library(disdat)
library(nimble)

dbern_vec <- nimbleFunction( ## Define the distribution
  run = function(x = double(1), p = double(1), wt = double(1),
                 log = integer(0, default = 0)) {
    returnType(double(0))
    logProb <- sum(wt*dbinom(x=x, size=1, prob=p, log = TRUE))
    if(log) return(logProb)
    else return(exp(logProb))
  })

rbern_vec <- nimbleFunction( ## Define a simulation function, optionally.
  run = function(n = integer(0), p = double(1), wt = double(1)) {
    returnType(double(1))
    if(n != 1) print("rbern_vec only allows n = 1; using n = 1.")
    smp <- rbinom(n, p, size=1)
    return(smp)
  })

MaxNetcode <- nimbleCode({
  for(j in 1:k) {
    ratetau[j] <- MeanLambda*reg[j]
    sigma2[j] ~ dexp(rate=ratetau[j])
  }
  omega[1:k, 1:k] <- diag(1/sigma2[1:k]) # precision
  alpha ~ dnorm(0, sd = 1)
  beta[1:k] ~ dmnorm(zeros[1:k], omega[1:k, 1:k])
  linpred[1:n] <- alpha + X[1:n, 1:k] %*% beta[1:k]
  logit(Prob[1:n]) <- linpred[1:n]
  y[1:n] ~ dbern_vec(p=Prob[1:n], wt=W[1:n])
})

DoAll <- function(X=1, cod, dat, cons, init) {
  Model <- nimbleModel(code = cod, data = dat,
                       constants = cons, inits = init)
 
  CmyModel <- compileNimble(Model, showCompilerOutput = FALSE)
 
  ConfModel <- configureMCMC(CmyModel)
 
  Targets <- c("alpha", paste0("beta[", 1:Const$k, "]"))
  ConfModel$addMonitors(Targets)
 
  ConfModel$addSampler(target = c("alpha", "beta"), type = "RW_block",
                       control = list(adaptInterval = 5))
 
  myMCMC <- buildMCMC(ConfModel, monitors = Targets)
  CmyMCMC <- compileNimble(myMCMC)
 
  mcmc <- runMCMC(CmyMCMC, samplesAsCodaMCMC=TRUE, inits=init,
                  nchains=1, nburnin = 5, niter=10, thin=1)
  mcmc
}


CAN <- disData("CAN")
EnvPO <- c("alt", "asp2", "ontprec", "ontprec4", "ontprecsd", "ontslp",
           "onttemp", "onttempsd", "onttmin4", "ontveg", "watdist")
Species <- "can02"

SpData <- rbind(CAN$po[CAN$po$spid==Species,c("occ", EnvPO)], CAN$bg[,c("occ", EnvPO)])
SpData[,EnvPO] <- apply(SpData[,EnvPO], 2, scale) # scale covariates


mm <- as.matrix(SpData[,EnvPO])
weights <- SpData$occ + (1 - SpData$occ) * 100

Const <- list(n = nrow(mm), k = ncol(mm), MeanLambda = 1,
              zeros=rep(0,ncol(mm)))
Data <- list(y = SpData$occ, X = mm, W = rep(1, nrow(mm)), reg = rep(1, ncol(mm)))
Inits <- list(alpha = rnorm(1, log(1/(1+1/mean(SpData$occ))), 1),
              beta = rnorm(Const$k, 0, sd=0.1), sigma2 = rep(1, ncol(mm)))


this_works <- lapply(1:2, DoAll, cod=MaxNetcode, dat=Data, cons=Const, init=Inits)

library(parallel)

a_cluster <- makeCluster(2)
clusterExport(cl = a_cluster, varlist = c("nimbleModel", "dbern_vec"))
this_doesnt_work <- parLapply(cl=a_cluster, 1:2, DoAll, cod=MaxNetcode, dat=Data, cons=Const, init=Inits)
stopCluster(a_cluster)

Chris Paciorek

Jan 16, 2026, 7:49:57 PM
to Bob O'Hara, nimble-users
Hi Bob,

For your parallelization case, we have some guidance in this nimble example. In particular, please put `library(nimble)` as the first line of code in `DoAll` so the workers can all find the various functions in the nimble namespace. You also need to `clusterExport` the `rbern_vec` function. And you have a stray global `Const` inside of `DoAll` that you'll need to fix.
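
To make that concrete, here is a rough, untested sketch of your DoAll and cluster setup with just those three changes applied (everything else left as in your script):

DoAll <- function(X=1, cod, dat, cons, init) {
  library(nimble)   # first: load nimble on the worker
  Model <- nimbleModel(code = cod, data = dat,
                       constants = cons, inits = init)
  CmyModel <- compileNimble(Model, showCompilerOutput = FALSE)
  ConfModel <- configureMCMC(CmyModel)
  Targets <- c("alpha", paste0("beta[", 1:cons$k, "]"))  # use cons$k, not the global Const$k
  ConfModel$addMonitors(Targets)
  ConfModel$addSampler(target = c("alpha", "beta"), type = "RW_block",
                       control = list(adaptInterval = 5))
  myMCMC <- buildMCMC(ConfModel, monitors = Targets)
  CmyMCMC <- compileNimble(myMCMC)
  runMCMC(CmyMCMC, samplesAsCodaMCMC=TRUE, inits=init,
          nchains=1, nburnin = 5, niter=10, thin=1)
}

a_cluster <- makeCluster(2)
clusterExport(cl = a_cluster, varlist = c("dbern_vec", "rbern_vec"))  # export both custom functions
this_should_work <- parLapply(cl = a_cluster, 1:2, DoAll,
                              cod = MaxNetcode, dat = Data, cons = Const, init = Inits)
stopCluster(a_cluster)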

As for the vignette, presumably the trick is getting {r,d}bern_vec into the right environment. I think that needs to be in whatever the global environment is in the context of the vignette. We run into similar issues when using user-defined functions in `test_that` testing contexts, and we usually need to do something like this:

assign('rbern_vec', rbern_vec, envir = .GlobalEnv)

So perhaps there is a way for you to put that line above in your vignette code in the appropriate place (hopefully in a hidden code chunk that is executed but not printed into the vignette output).
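
For example (just a sketch; the exact chunk placement is up to you), a hidden setup chunk run after the nimbleFunctions are defined could contain:

## inside a vignette chunk with include = FALSE, after dbern_vec and rbern_vec are defined
assign('dbern_vec', dbern_vec, envir = .GlobalEnv)
assign('rbern_vec', rbern_vec, envir = .GlobalEnv)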

-chris


