Computing Negative Log Likelihood with Log Determinant Terms


James Ferguson

Dec 2, 2025, 6:38:07 PM
to gtsam users
For a Gaussian factor, the overall negative log likelihood requires both the Mahalanobis error and the log-determinant of the covariance matrix (plus a constant (d/2)*log(2*pi) term that does not depend on the noise parameters). optimizer.error() gives you the Mahalanobis part but not the log-determinant, which is almost always fine. But if you are calibrating noise parameters from a data set, you need both parts, since the noise models vary with the parameters.

I couldn't find a built-in way to do this, so I wrote the code below, which I think is correct. If anyone needs an example or wants to implement this properly (e.g. in GaussianFactor), here you go...

double TrackerCalibrator::compute_nll() const {
    double nll = 0.0;
    for (const auto& factor : graph_) {
        if (!factor) continue;  // factor graphs may contain null slots
        auto noise_model_factor = std::dynamic_pointer_cast<gtsam::NoiseModelFactor>(factor);
        if (!noise_model_factor) {
            std::cout << "Cast from NonlinearFactor to NoiseModelFactor failed!" << std::endl;
            continue;
        }

        auto base_model = noise_model_factor->noiseModel();

        auto gaussian_model = std::dynamic_pointer_cast<gtsam::noiseModel::Gaussian>(base_model);
        if (!gaussian_model) {
            std::cout << "Cast from NoiseModel to Gaussian failed!" << std::endl;
            continue;   // Robust loss model?
        }

        double mahal = noise_model_factor->error(values_);  // 0.5 * whitened^T whitened

        gtsam::Matrix cov = gaussian_model->covariance();
        const int d = static_cast<int>(gaussian_model->dim());

        Eigen::LLT<gtsam::Matrix> llt(cov);
        if (llt.info() != Eigen::Success) {
            std::cerr << "Covariance LLT failed!\n";
            continue;
        }

        gtsam::Matrix L = llt.matrixL();  // Matrix6 would hard-code a 6x6 factor; d varies

        // sum_i log L(i,i) = 0.5 * log det(cov), since det(cov) = det(L)^2
        double log_det = 0.0;
        for (int i = 0; i < d; i++)
            log_det += std::log(L(i, i));

        nll += mahal + log_det;  // 0.5 * (whitened^T whitened + log det(cov))
    }

    return nll;
}
