# API

## Extended Kalman filter (EKF)
- `ekf.diag_fisher` applies an online Bayesian update based on a Taylor approximation of the log-likelihood. It uses the diagonal empirical Fisher information matrix as a positive-definite alternative to the Hessian, and is equivalent to natural gradient descent, following Ollivier, 2019.
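As a concrete illustration, the update can be sketched in NumPy for a linear-Gaussian model. The model, function name, and signature below are illustrative assumptions for the sketch, not the library's API:

```python
import numpy as np

def ekf_diag_fisher_step(mean, cov_diag, xs, ys):
    """One online Bayesian update using the diagonal empirical Fisher.

    Illustrative model assumption: y ~ N(x @ theta, 1), so the per-sample
    log-likelihood gradient at theta = mean is (y - x @ mean) * x.
    """
    resid = ys - xs @ mean
    grads = resid[:, None] * xs                  # (n, d) per-sample gradients
    # Diagonal empirical Fisher: sum of squared per-sample gradients.
    fisher_diag = (grads ** 2).sum(axis=0)
    # Posterior precision = prior precision + Fisher (positive by construction).
    new_cov = 1.0 / (1.0 / cov_diag + fisher_diag)
    # Mean step preconditioned by the new covariance (natural-gradient flavour).
    new_mean = mean + new_cov * grads.sum(axis=0)
    return new_mean, new_cov

rng = np.random.default_rng(0)
theta_true = np.array([1.0, -2.0])
xs = rng.normal(size=(512, 2))
ys = xs @ theta_true + rng.normal(size=512)

mean, cov = np.zeros(2), np.ones(2)
for b in range(8):                               # stream the data in batches
    mean, cov = ekf_diag_fisher_step(mean, cov,
                                     xs[64 * b:64 * (b + 1)],
                                     ys[64 * b:64 * (b + 1)])
# The mean drifts toward theta_true while the covariance contracts.
```

Each batch's posterior becomes the next batch's prior, which is what makes the update online.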
## Laplace approximation
- `laplace.dense_fisher` calculates the empirical Fisher information matrix and uses it to approximate the posterior precision, i.e. a Laplace approximation.
- `laplace.dense_ggn` calculates the Generalised Gauss-Newton matrix, which is equivalent to the non-empirical Fisher in most neural network settings; see Martens, 2020.
- `laplace.diag_fisher` same as `laplace.dense_fisher` but uses the diagonal of the empirical Fisher information matrix instead.
- `laplace.diag_ggn` same as `laplace.dense_ggn` but uses the diagonal of the Generalised Gauss-Newton matrix instead.
All Laplace transforms leave the parameters unmodified. Comprehensive details on Laplace approximations can be found in Daxberger et al, 2021.
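To make the idea concrete, here is a minimal NumPy sketch of a dense empirical Fisher Laplace approximation for a linear-Gaussian model. The model and names are illustrative assumptions for the sketch, not the library's code:

```python
import numpy as np

def dense_fisher_precision(params, xs, ys, prior_prec=1.0):
    """Approximate posterior precision = prior precision + empirical Fisher.

    Illustrative model assumption: y ~ N(x @ params, 1), so the per-sample
    log-likelihood gradient is (y - x @ params) * x. The parameters themselves
    are left untouched; only a precision matrix is computed.
    """
    resid = ys - xs @ params
    grads = resid[:, None] * xs                   # (n, d) per-sample gradients
    fisher = grads.T @ grads                      # sum of g g^T outer products
    return prior_prec * np.eye(xs.shape[1]) + fisher

rng = np.random.default_rng(1)
xs = rng.normal(size=(256, 3))
ys = xs @ np.array([0.5, -1.0, 2.0]) + rng.normal(size=256)
params = np.linalg.lstsq(xs, ys, rcond=None)[0]   # a trained point estimate

prec = dense_fisher_precision(params, xs, ys)
cov = np.linalg.inv(prec)                         # Gaussian posterior covariance
# prec is symmetric positive definite by construction.
```

The diagonal variants simply keep `(grads ** 2).sum(axis=0)` instead of the full outer-product sum, trading fidelity for memory.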
## Stochastic gradient Markov chain Monte Carlo (SGMCMC)
- `sgmcmc.sgld` implements stochastic gradient Langevin dynamics (SGLD) from Welling and Teh, 2011.
- `sgmcmc.sghmc` implements the stochastic gradient Hamiltonian Monte Carlo (SGHMC) algorithm from Chen et al, 2014 (without momenta resampling).
- `sgmcmc.sgnht` implements the stochastic gradient Nosé-Hoover thermostat (SGNHT) algorithm from Ding et al, 2014 (SGHMC with an adaptive friction coefficient).
For an overview and unifying framework for SGMCMC methods, see Ma et al, 2015.
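As a sketch of the simplest of these methods: SGLD adds Gaussian noise of the right scale to gradient ascent on the log posterior, so the iterates sample from (an approximation of) the posterior rather than collapsing to a mode. A full-batch gradient is used below for clarity; names and signature are illustrative, not the library's API:

```python
import numpy as np

def sgld(grad_log_post, theta0, stepsize, n_steps, rng):
    """SGLD update (Welling and Teh, 2011):
    theta <- theta + (stepsize / 2) * grad_log_post(theta) + N(0, stepsize).
    Full-batch gradient here for clarity; SGLD proper uses minibatch estimates.
    """
    theta, samples = theta0, []
    for _ in range(n_steps):
        theta = (theta + 0.5 * stepsize * grad_log_post(theta)
                 + rng.normal(scale=np.sqrt(stepsize)))
        samples.append(theta)
    return np.array(samples)

rng = np.random.default_rng(0)
# Toy target: standard normal posterior, so grad log p(theta) = -theta.
samples = sgld(lambda t: -t, theta0=3.0, stepsize=0.1, n_steps=20000, rng=rng)
burned = samples[2000:]                           # discard burn-in
print(burned.mean(), burned.std())                # approximately 0 and 1
```

SGHMC and SGNHT follow the same pattern but carry momenta (and, for SGNHT, an adaptive friction variable) through the state.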
## Variational inference (VI)
- `vi.diag` implements a diagonal Gaussian variational distribution. It expects a `torchopt` optimizer for handling the minimization of the NELBO. Also see `vi.diag.nelbo` for simply calculating the NELBO with respect to a `log_posterior` and a diagonal Gaussian distribution.
A review of variational inference can be found in Blei et al, 2017.
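The NELBO itself is simple to state: the negative expected log posterior under the variational distribution q, minus the entropy of q. A Monte Carlo sketch for a diagonal Gaussian q follows; names and signature are illustrative assumptions, not the library's API:

```python
import numpy as np

def diag_gaussian_nelbo(log_posterior, mean, sd_diag, n_samples, rng):
    """NELBO = -E_q[log_posterior(theta)] - entropy(q) for
    q = N(mean, diag(sd_diag**2)), estimated with reparameterised samples."""
    eps = rng.normal(size=(n_samples, len(mean)))
    thetas = mean + sd_diag * eps                      # theta = mu + sd * eps
    expected_lp = np.mean([log_posterior(t) for t in thetas])
    entropy = 0.5 * np.sum(np.log(2 * np.pi * np.e * sd_diag ** 2))
    return -expected_lp - entropy

# Toy target: standard normal posterior in 2 dimensions (normalised), so the
# NELBO equals KL(q || posterior) and is minimised (at zero) by mean=0, sd=1.
log_post = lambda t: -0.5 * t @ t - np.log(2 * np.pi)
rng = np.random.default_rng(0)
at_optimum = diag_gaussian_nelbo(log_post, np.zeros(2), np.ones(2), 4000, rng)
off_optimum = diag_gaussian_nelbo(log_post, np.full(2, 2.0), np.full(2, 0.5),
                                  4000, rng)
# at_optimum is close to zero; off_optimum is substantially larger.
```

Because the samples are reparameterised (`mean + sd * eps`), the estimate is differentiable in `mean` and `sd_diag`, which is what lets a gradient-based optimizer minimize it.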
## Optim
- `optim` wraps `torch.optim` optimizers within the unified `posteriors` API, allowing easy swapping with UQ methods.
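The wrapper pattern can be sketched without `torch` at all: any optimizer exposing the same init/update interface as the UQ methods can be dropped into the same training loop. Everything below (names, signatures, the toy SGD step) is an illustrative assumption, not the library's code:

```python
from dataclasses import dataclass

@dataclass
class OptimState:
    params: tuple        # current parameter values

def build_sgd(grad_fn, lr=0.1):
    """Wrap a plain gradient step in the init/update shape a UQ method would
    use, so the two are interchangeable in a training loop."""
    def init(params):
        return OptimState(params=tuple(params))

    def update(state, batch):
        grads = grad_fn(state.params, batch)
        return OptimState(params=tuple(p - lr * g
                                       for p, g in zip(state.params, grads)))

    return init, update

# Toy problem: minimise sum((p - target)^2) for a fixed "batch" of targets.
grad_fn = lambda params, batch: tuple(2 * (p - t)
                                      for p, t in zip(params, batch))
init, update = build_sgd(grad_fn, lr=0.1)
state = init((0.0, 0.0))
for _ in range(60):
    state = update(state, batch=(3.0, -1.5))
# state.params converges to (3.0, -1.5).
```

The training loop never needs to know whether `update` is doing plain optimization or a Bayesian update, which is the point of the unified API.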