# API

## Extended Kalman filter (EKF)
- `ekf.dense_fisher` applies an online Bayesian update based on a Taylor approximation of the log-likelihood. Uses the empirical Fisher information matrix as a positive-definite alternative to the Hessian. Natural gradient descent equivalence following Ollivier, 2019.
- `ekf.diag_fisher` same as `ekf.dense_fisher` but uses the diagonal of the empirical Fisher information matrix instead.
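For orientation, here is a minimal sketch of how an EKF transform might be used, assuming the `build`/`init`/`update` transform pattern with a toy model; the `lr` keyword and the `(value, aux)` return convention are assumptions to verify against the docstrings:

```python
import torch
import posteriors

# Toy model; shapes and keyword names are illustrative assumptions.
model = torch.nn.Linear(10, 1)
params = dict(model.named_parameters())


def log_likelihood(params, batch):
    # Returns a (value, aux) pair.
    x, y = batch
    preds = torch.func.functional_call(model, params, x)
    return torch.distributions.Normal(preds, 1.0).log_prob(y).sum(), preds


transform = posteriors.ekf.diag_fisher.build(log_likelihood, lr=1e-2)  # lr name assumed
state = transform.init(params)

batch = (torch.randn(32, 10), torch.randn(32, 1))
state = transform.update(state, batch)  # one online Bayesian update
```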
## Laplace approximation
- `laplace.dense_fisher` calculates the empirical Fisher information matrix and uses it to approximate the posterior precision, i.e. a Laplace approximation.
- `laplace.dense_ggn` calculates the Generalised Gauss-Newton matrix, which is equivalent to the non-empirical Fisher in most neural network settings; see Martens, 2020.
- `laplace.dense_hessian` calculates the Hessian of the negative log posterior.
- `laplace.diag_fisher` same as `laplace.dense_fisher` but uses the diagonal of the empirical Fisher information matrix instead.
- `laplace.diag_ggn` same as `laplace.dense_ggn` but uses the diagonal of the Generalised Gauss-Newton matrix instead.
All Laplace transforms leave the parameters unmodified. Comprehensive details on Laplace approximations can be found in Daxberger et al, 2021.
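A rough sketch of fitting a diagonal Laplace approximation around fixed (e.g. MAP-trained) parameters; the toy model and isotropic Gaussian prior below are illustrative assumptions:

```python
import torch
import posteriors

# Toy model and an isotropic Gaussian prior; both are illustrative.
model = torch.nn.Linear(10, 1)
params = dict(model.named_parameters())


def log_posterior(params, batch):
    x, y = batch
    preds = torch.func.functional_call(model, params, x)
    log_lik = torch.distributions.Normal(preds, 1.0).log_prob(y).sum()
    log_prior = sum(
        torch.distributions.Normal(0.0, 1.0).log_prob(p).sum() for p in params.values()
    )
    return log_lik + log_prior, preds


transform = posteriors.laplace.diag_fisher.build(log_posterior)
state = transform.init(params)  # params assumed to already be (near) the MAP
for _ in range(5):
    batch = (torch.randn(32, 10), torch.randn(32, 1))
    state = transform.update(state, batch)  # accumulates the diagonal Fisher
# state keeps params unmodified and carries the approximate posterior precision;
# check the state dataclass for the exact field name.
```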
## Stochastic gradient Markov chain Monte Carlo (SGMCMC)
- `sgmcmc.sgld` implements stochastic gradient Langevin dynamics (SGLD) from Welling and Teh, 2011.
- `sgmcmc.sghmc` implements the stochastic gradient Hamiltonian Monte Carlo (SGHMC) algorithm from Chen et al, 2014 (without momenta resampling).
- `sgmcmc.sgnht` implements the stochastic gradient Nosé-Hoover thermostat (SGNHT) algorithm from Ding et al, 2014 (SGHMC with adaptive friction coefficient).
For an overview and unifying framework for SGMCMC methods, see Ma et al, 2015.
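A hedged sketch of collecting SGLD samples, reusing `model`, `params` and `log_posterior` from the Laplace sketch above; the `lr` keyword and the `state.params` field are assumptions:

```python
import torch
import posteriors

# Reuses model, params and log_posterior from the Laplace sketch above.
transform = posteriors.sgmcmc.sgld.build(log_posterior, lr=1e-3)  # lr name assumed
state = transform.init(params)

samples = []
for step in range(100):
    batch = (torch.randn(32, 10), torch.randn(32, 1))
    state = transform.update(state, batch)
    if step % 10 == 0:  # thin the chain; state.params assumed to hold the position
        samples.append({k: v.detach().clone() for k, v in state.params.items()})
```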
## Variational inference (VI)
- `vi.dense` implements a Gaussian variational distribution. Expects a `torchopt` optimizer for handling the minimization of the NELBO. Also see `vi.dense.nelbo` for simply calculating the NELBO with respect to a `log_posterior` and Gaussian distribution.
- `vi.diag` same as `vi.dense` but uses a diagonal Gaussian variational distribution.
A review of variational inference can be found in Blei et al, 2017.
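A sketch of fitting the diagonal variational distribution with a `torchopt` optimizer, again reusing the Laplace setup above; everything here is an assumption-laden illustration rather than the canonical example:

```python
import torch
import torchopt
import posteriors

# Reuses model, params and log_posterior from the Laplace sketch above.
transform = posteriors.vi.diag.build(log_posterior, torchopt.adam(lr=1e-3))
state = transform.init(params)

for step in range(100):
    batch = (torch.randn(32, 10), torch.randn(32, 1))
    state = transform.update(state, batch)  # one stochastic NELBO gradient step
```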
## Optim
- `optim` wraps `torch.optim` optimizers in the unified `posteriors` API, allowing them to be swapped easily with the UQ methods above.
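A sketch of the wrapper in use; the exact `build` signature (optimizer-class keyword plus forwarded kwargs) is an assumption to check against the docstring, the point being that the returned transform exposes the same `init`/`update` interface as the UQ methods above:

```python
import torch
import posteriors

# Reuses the model from the sketches above; loss_fn returns (value, aux).
def loss_fn(params, batch):
    x, y = batch
    preds = torch.func.functional_call(model, params, x)
    return torch.nn.functional.mse_loss(preds, y), preds


# optimizer_cls keyword and kwarg forwarding are assumptions.
transform = posteriors.optim.build(loss_fn, optimizer_cls=torch.optim.AdamW, lr=1e-3)
state = transform.init(dict(model.named_parameters()))

for step in range(100):
    batch = (torch.randn(32, 10), torch.randn(32, 1))
    state = transform.update(state, batch)
```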