3.3 Introduction to Recursive Bayesian Filtering of Probability Density Functions
A discrete dynamic process is one in which the current state of the system depends on one or more prior states. In a continuous process, this dependence is captured by a differential equation. When observations occur only at discrete times, estimation conditioned on those observations can occur only at those times, so the differential equation is replaced by its finite difference equivalent, which links the state at observation time tn to the states at observation times prior to tn.
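As a concrete illustration (the example dynamics are an assumption, not taken from the text), the scalar linear ODE dx/dt = a·x has the exact discrete-time equivalent xn = exp(a·Δt)·xn−1 at sample times tn = n·Δt, so estimation at the observation times needs only the one-step difference equation, not the ODE itself:

```python
import math

# Hypothetical illustration: replace a continuous-time ODE with its
# finite difference equivalent at the observation times t_n = n*dt.
a = -0.5          # decay rate (assumed value)
dt = 0.1          # observation interval (assumed value)
x0 = 2.0          # initial state

def x_continuous(n):
    # Continuous-time solution x(t) = x0 * exp(a*t), evaluated at t_n
    return x0 * math.exp(a * n * dt)

# One-step transition "function" linking x_n to x_{n-1}; for this
# linear ODE it is just a scalar multiplier.
phi = math.exp(a * dt)

x = x0
for n in range(1, 11):
    x = phi * x          # x_n = phi * x_{n-1}

# The recursion reproduces the ODE solution exactly at the sample times
assert abs(x - x_continuous(10)) < 1e-12
```

The recursion never needs the state between observation times, which is exactly why the finite difference form suffices for discrete-time estimation.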
A first-order Markov process is one in which the current state depends only on the immediately preceding state. Thus, we can characterize a discrete random Markov dynamic process as

xn = fn−1(xn−1) + un + ηn
where xn is the state of the system (the state vector) at time tn, fn−1 is a deterministic transition function (matrix) that moves the state x from time tn−1 to time tn, and un is a known (usually deterministic) control that constitutes some external input driving the system dynamics.
Although the white noise η (not necessarily Gaussian) can enter at the input and be transformed by the transition function, it is usually assumed to be additive, representing those parts of the true transition function that are not modeled. Note that the Markov process given above is only a model ...
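The model above can be sketched in a few lines. This is a minimal simulation under the usual additive-noise assumption; the particular transition function, control input, and noise level are illustrative choices, not values from the text:

```python
import random

# Sketch of a first-order Markov dynamic process with additive noise:
#     x_n = f_{n-1}(x_{n-1}) + u_n + eta_n
# All numerical values below are assumptions for illustration.
random.seed(0)

def f(x):
    # Deterministic transition function (a stable linear map, assumed)
    return 0.9 * x

def u(n):
    # Known external control input (assumed constant here)
    return 0.5

sigma = 0.1   # standard deviation of the additive white noise eta

x = 1.0       # initial state x_0
states = [x]
for n in range(1, 21):
    eta = random.gauss(0.0, sigma)   # additive process noise
    x = f(x) + u(n) + eta            # first-order Markov update
    states.append(x)

# Each state depends only on the immediately preceding state, so the
# whole trajectory is generated by the one-step recursion alone.
assert len(states) == 21
```

Because the recursion consults only xn−1 at each step, the simulation never stores or revisits earlier states, which is the defining property of a first-order Markov process.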