This is the first part of a three-part article examining methods for Bayesian estimation and tracking. This first part presents the general theory of Bayesian estimation and shows that Bayesian estimation methods fall into two broad classes. In the first class, the observation-conditioned posterior density is propagated in time through a predictor/corrector method. In the second class, the first two moments are propagated in time, with state and observation moment prediction steps followed by state moment update steps that incorporate the latest observations. We show how the moment propagation method leads to very general linear and extended Kalman filters that are applicable to non-Gaussian densities satisfying certain restrictions.
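The moment-propagation idea can be illustrated with a minimal linear Kalman filter sketch: a prediction step propagates the state mean and covariance, and an update step corrects them with the latest observation. The constant-velocity model and all parameter values below are illustrative assumptions, not taken from the article.

```python
import numpy as np

def kf_predict(x, P, F, Q):
    # State moment prediction: propagate mean and covariance forward in time.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def kf_update(x_pred, P_pred, z, H, R):
    # State moment update: correct the predicted moments with observation z.
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x_pred + K @ (z - H @ x_pred)
    P = (np.eye(len(x)) - K @ H) @ P_pred
    return x, P
```

For example, tracking a 1D position/velocity state from noisy position measurements alternates `kf_predict` and `kf_update` at each time step, which is the predictor/corrector pattern in moment form.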
In Part 2 of this article, we will show that for Gaussian densities, expanding all nonlinear functions in polynomials leads from the moment propagation method to several well-known Kalman filter methods. In Part 3, we will show that approximating a density by a set of Monte Carlo samples leads to particle filter methods, where the posterior density is propagated in time and moment integrals are approximated by sample moments. WIREs Comput Stat 2012. doi: 10.1002/wics.1211
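The particle filter idea previewed for Part 3 can be sketched as follows: the posterior is represented by weighted Monte Carlo samples that are propagated through the state transition, reweighted by the observation likelihood, and resampled; moment integrals are then approximated by sample moments. The 1D random-walk model and noise levels here are illustrative assumptions, not the article's.

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_step(particles, weights, z, proc_std=0.5, obs_std=0.5):
    # Propagate each sample through the state transition (prediction).
    particles = particles + rng.normal(0.0, proc_std, size=particles.shape)
    # Reweight by the Gaussian observation likelihood (correction).
    weights = weights * np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
    weights = weights / weights.sum()
    # Resample to avoid weight degeneracy.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))
```

After each step, the posterior mean is approximated by the sample moment `np.sum(weights * particles)`, in contrast to the closed-form moment recursions of the Kalman filter family.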