API

Extended Kalman filter (EKF)

  • ekf.diag_fisher applies an online Bayesian update based on a Taylor approximation of the log-likelihood, using the diagonal empirical Fisher information matrix as a positive-definite alternative to the Hessian. The update is equivalent to a step of natural gradient descent, following Ollivier, 2019.
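
A minimal usage sketch, assuming posteriors' build/init/update pattern, that ekf.diag_fisher.build accepts a log_likelihood function and an lr argument, and that log-probability functions return a (value, aux) tuple; all of these names and signatures are illustrative assumptions, not verbatim from this page:

```python
import torch
import posteriors

# Toy linear-Gaussian model and mini-batches (illustrative only).
params = {"w": torch.zeros(3)}
x = torch.randn(100, 3)
y = x @ torch.tensor([1.0, -2.0, 0.5]) + 0.1 * torch.randn(100)
batches = [(x[i : i + 10], y[i : i + 10]) for i in range(0, 100, 10)]

def log_likelihood(params, batch):
    xb, yb = batch
    log_lik = torch.distributions.Normal(xb @ params["w"], 1.0).log_prob(yb).sum()
    return log_lik, torch.tensor([])  # (value, aux) return convention assumed

transform = posteriors.ekf.diag_fisher.build(log_likelihood, lr=1e-2)
state = transform.init(params)

for batch in batches:
    state = transform.update(state, batch)  # one online Bayesian update per batch
```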

Laplace approximation

  • laplace.dense_fisher calculates the empirical Fisher information matrix and uses it to approximate the posterior precision, i.e. a Laplace approximation.
  • laplace.dense_ggn calculates the Generalised Gauss-Newton matrix, which is equivalent to the non-empirical Fisher in most neural network settings; see Martens, 2020.
  • laplace.diag_fisher same as laplace.dense_fisher but uses the diagonal of the empirical Fisher information matrix instead.
  • laplace.diag_ggn same as laplace.dense_ggn but uses the diagonal of the Generalised Gauss-Newton matrix instead.

All Laplace transforms leave the parameters unmodified. Comprehensive details on Laplace approximations can be found in Daxberger et al, 2021.
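
As a usage sketch: train to a MAP estimate first, then run the Laplace transform over the data to accumulate the (here diagonal, empirical) Fisher as the posterior precision. The function name follows the list above, but the build/init/update pattern, the absence of extra build arguments, and the (value, aux) return convention are assumptions:

```python
import torch
import posteriors

params = {"w": torch.ones(3)}  # pretend these are MAP-trained parameters
x = torch.randn(100, 3)
y = x @ torch.tensor([1.0, -2.0, 0.5]) + 0.1 * torch.randn(100)
batches = [(x[i : i + 10], y[i : i + 10]) for i in range(0, 100, 10)]

def log_posterior(params, batch):
    xb, yb = batch
    log_lik = torch.distributions.Normal(xb @ params["w"], 1.0).log_prob(yb).sum()
    log_prior = torch.distributions.Normal(0.0, 1.0).log_prob(params["w"]).sum()
    return log_lik + log_prior, torch.tensor([])  # (value, aux) assumed

transform = posteriors.laplace.diag_fisher.build(log_posterior)
state = transform.init(params)
for batch in batches:
    state = transform.update(state, batch)  # accumulates diagonal Fisher terms
# The parameters in `state` are left unmodified; the state additionally carries
# the approximate (diagonal) posterior precision.
```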

Stochastic gradient Markov chain Monte Carlo (SGMCMC)

For an overview and unifying framework for SGMCMC methods, see Ma et al, 2015.
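
For illustration, the simplest member of this family is stochastic gradient Langevin dynamics (SGLD; Welling and Teh, 2011), which adds appropriately scaled Gaussian noise to gradient ascent on the log-posterior. A minimal plain-PyTorch sketch of one SGLD step, not tied to any particular library API:

```python
import torch

def sgld_step(params, batch, log_posterior, lr=1e-5):
    # One SGLD step (Welling and Teh, 2011), a special case of the
    # Ma et al, 2015 framework:
    #   theta <- theta + (lr / 2) * grad log p(theta | batch) + N(0, lr * I).
    # `log_posterior` should return an unbiased mini-batch estimate of the
    # full-data log-posterior (likelihood term rescaled by N / batch_size).
    log_post = log_posterior(params, batch)
    grads = torch.autograd.grad(log_post, params)
    with torch.no_grad():
        for p, g in zip(params, grads):
            p.add_(0.5 * lr * g + lr**0.5 * torch.randn_like(p))
    return params

# Example: draw correlated samples from a standard Gaussian "posterior".
theta = [torch.zeros(3, requires_grad=True)]
samples = []
for _ in range(1000):
    theta = sgld_step(theta, None, lambda p, _: -0.5 * (p[0] ** 2).sum(), lr=1e-2)
    samples.append(theta[0].detach().clone())
```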

Variational inference (VI)

  • vi.diag implements a diagonal Gaussian variational distribution. Expects a torchopt optimizer to handle minimization of the NELBO. See also vi.diag.nelbo for simply evaluating the NELBO with respect to a log_posterior and a diagonal Gaussian distribution; a sketch of this quantity follows below.
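
Since the NELBO is the central quantity here, a plain-PyTorch sketch of a reparameterised estimate for a diagonal Gaussian q = N(mu, diag(sd²)) may help. It mirrors what vi.diag.nelbo computes, but the function below and its signature are assumptions, not the library's actual API:

```python
import torch

def diag_gaussian_nelbo(mu, log_sd, log_posterior, batch, n_samples=1):
    # NELBO(q) = -E_q[log p(theta, batch)] - H[q]
    # for q = N(mu, diag(exp(log_sd)^2)), estimated with the
    # reparameterisation trick so it is differentiable in (mu, log_sd).
    q = torch.distributions.Normal(mu, log_sd.exp())
    thetas = q.rsample((n_samples,))  # differentiable samples from q
    expected_log_post = torch.stack(
        [log_posterior(theta, batch) for theta in thetas]
    ).mean()
    return -expected_log_post - q.entropy().sum()
```

Minimising this estimate over (mu, log_sd) with a torchopt optimizer recovers the vi.diag training loop described above.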

A review of variational inference can be found in Blei et al, 2017.

Optim

  • optim wraps torch.optim optimizers within the unified posteriors API, allowing them to be swapped easily with the UQ methods above (see the sketch below under TorchOpt).

TorchOpt

  • torchopt wraps torchopt optimizers within the unified posteriors API, again allowing easy swapping with the UQ methods above.
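
Both wrappers exist so that plain gradient-based training slots into the same loop as the UQ methods above. A sketch of the swap, assuming the build/init/update pattern used throughout; the build signatures for the two wrappers are hypothetical:

```python
import torch
import torchopt
import posteriors

params = {"w": torch.zeros(3)}
x = torch.randn(100, 3)
y = x @ torch.tensor([1.0, -2.0, 0.5]) + 0.1 * torch.randn(100)
batches = [(x[i : i + 10], y[i : i + 10]) for i in range(0, 100, 10)]

def log_posterior(params, batch):
    xb, yb = batch
    log_lik = torch.distributions.Normal(xb @ params["w"], 1.0).log_prob(yb).sum()
    log_prior = torch.distributions.Normal(0.0, 1.0).log_prob(params["w"]).sum()
    return log_lik + log_prior, torch.tensor([])  # (value, aux) assumed

# Any of these builds (signatures assumed) yields the same init/update loop:
transform = posteriors.torchopt.build(log_posterior, torchopt.adam(lr=1e-2))
# transform = posteriors.optim.build(log_posterior, torch.optim.SGD, lr=1e-2)
# transform = posteriors.vi.diag.build(log_posterior, torchopt.adam(lr=1e-2))

state = transform.init(params)
for batch in batches:
    state = transform.update(state, batch)
```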