API

Extended Kalman filter (EKF)

  • ekf.dense_fisher applies an online Bayesian update based on a Taylor approximation of the log-likelihood, using the empirical Fisher information matrix as a positive-definite alternative to the Hessian. The update is equivalent to natural gradient descent, following Ollivier, 2019.
  • ekf.diag_fisher same as ekf.dense_fisher but uses only the diagonal of the empirical Fisher information matrix. A usage sketch follows this list.
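A minimal sketch of the expected usage pattern for ekf.diag_fisher, assuming the shared build/init/update transform interface. The toy model, the log_likelihood definition and the lr value are placeholders, and the exact signatures (including whether log_likelihood is expected to also return auxiliary information) should be checked against the module reference.

```python
import torch
import posteriors

# Toy model and parameters; purely illustrative.
model = torch.nn.Linear(10, 1)
params = dict(model.named_parameters())

def log_likelihood(params, batch):
    # Placeholder Gaussian log-likelihood of the batch under the model
    # (assumed signature: parameters first, batch second, scalar output).
    x, y = batch
    pred = torch.func.functional_call(model, params, (x,))
    return -torch.nn.functional.mse_loss(pred, y)

# Assumed interface: build the transform, initialise its state from the
# parameters, then apply the online Bayesian update once per incoming batch.
transform = posteriors.ekf.diag_fisher.build(log_likelihood, lr=1e-2)
state = transform.init(params)

for batch in [(torch.randn(8, 10), torch.randn(8, 1)) for _ in range(3)]:
    state = transform.update(state, batch)
```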

Laplace approximation

All Laplace approximation transforms leave the parameters unmodified; only the curvature-based covariance is fitted. Comprehensive details on Laplace approximations can be found in Daxberger et al, 2021. A usage sketch follows below.
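As an illustration, the sketch below fits a diagonal Laplace approximation around fixed parameters. It assumes a laplace.diag_fisher module (named by analogy with the ekf modules above) with the same build/init/update interface; the log_posterior definition and the exact signatures are placeholders to verify against the module reference.

```python
import torch
import posteriors

# Toy model standing in for a MAP-trained network; purely illustrative.
model = torch.nn.Linear(10, 1)
params = dict(model.named_parameters())

def log_posterior(params, batch):
    # Placeholder log-posterior: Gaussian likelihood plus a unit normal prior.
    x, y = batch
    pred = torch.func.functional_call(model, params, (x,))
    log_lik = -torch.nn.functional.mse_loss(pred, y)
    log_prior = sum(
        torch.distributions.Normal(0.0, 1.0).log_prob(p).sum() for p in params.values()
    )
    return log_lik + log_prior

# Assumed interface: the update only accumulates curvature information,
# so the parameters in the state stay at the values passed to init.
transform = posteriors.laplace.diag_fisher.build(log_posterior)
state = transform.init(params)

for batch in [(torch.randn(8, 10), torch.randn(8, 1)) for _ in range(3)]:
    state = transform.update(state, batch)
```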

Stochastic gradient Markov chain Monte Carlo (SGMCMC)

For an overview and unifying framework for SGMCMC methods, see Ma et al, 2015.

Variational inference (VI)

  • vi.dense implements a Gaussian variational distribution. Expects a torchopt optimizer for handling the minimization of the negative evidence lower bound (NELBO). vi.dense.nelbo is also provided for computing the NELBO of a Gaussian distribution with respect to a log_posterior.
  • vi.diag same as vi.dense but restricts the Gaussian variational distribution to a diagonal covariance. A usage sketch follows this list.
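A minimal sketch of vi.diag with a torchopt optimizer, again assuming the shared build/init/update interface; the toy model, the log_posterior definition and the optimizer settings are placeholders, and the exact keyword names should be checked against the vi reference.

```python
import torch
import torchopt
import posteriors

model = torch.nn.Linear(10, 1)
params = dict(model.named_parameters())

def log_posterior(params, batch):
    # Placeholder log-posterior: Gaussian likelihood plus a unit normal prior.
    x, y = batch
    pred = torch.func.functional_call(model, params, (x,))
    log_lik = -torch.nn.functional.mse_loss(pred, y)
    log_prior = sum(
        torch.distributions.Normal(0.0, 1.0).log_prob(p).sum() for p in params.values()
    )
    return log_lik + log_prior

# Assumed interface: the torchopt optimizer is handed to build and used
# internally to minimise the NELBO; the state carries the variational mean
# and diagonal variances.
transform = posteriors.vi.diag.build(log_posterior, torchopt.adam(lr=1e-3))
state = transform.init(params)

for batch in [(torch.randn(8, 10), torch.randn(8, 1)) for _ in range(3)]:
    state = transform.update(state, batch)
```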

A review of variational inference can be found in Blei et al, 2017.

Optim

  • optim wraps torch.optim optimizers in the unified posteriors API, so they can be swapped easily with the UQ methods. A usage sketch follows below.
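A minimal sketch of the optim wrapper under the same interface assumptions as the examples above; the loss function, optimizer choice and hyperparameters are placeholders.

```python
import torch
import posteriors

model = torch.nn.Linear(10, 1)
params = dict(model.named_parameters())

def loss_fn(params, batch):
    # Plain loss; no posterior or uncertainty, just optimization.
    x, y = batch
    pred = torch.func.functional_call(model, params, (x,))
    return torch.nn.functional.mse_loss(pred, y)

# Assumed interface: wrap a torch.optim optimizer class so the training loop
# looks identical to the UQ transforms and can be swapped for any of them.
transform = posteriors.optim.build(loss_fn, torch.optim.AdamW, lr=1e-3)
state = transform.init(params)

for batch in [(torch.randn(8, 10), torch.randn(8, 1)) for _ in range(3)]:
    state = transform.update(state, batch)
```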

TorchOpt

  • torchopt wraps torchopt optimizers in the unified posteriors API, so they can be swapped easily with the UQ methods. A usage sketch follows below.
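By analogy, a short sketch of the torchopt wrapper under the same assumptions; the only intended difference from the optim sketch is that build takes a functional torchopt optimizer instance rather than a torch.optim class.

```python
import torch
import torchopt
import posteriors

model = torch.nn.Linear(10, 1)
params = dict(model.named_parameters())

def loss_fn(params, batch):
    # Same placeholder loss as in the optim sketch above.
    x, y = batch
    return torch.nn.functional.mse_loss(torch.func.functional_call(model, params, (x,)), y)

# Assumed interface: identical to optim.build, but with a torchopt optimizer.
transform = posteriors.torchopt.build(loss_fn, torchopt.adam(lr=1e-3))
state = transform.init(params)
state = transform.update(state, (torch.randn(8, 10), torch.randn(8, 1)))
```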