1 Stationary Processes and Time Series
1.1 Introduction
Forecasting the evolution of a man‐made system or a natural phenomenon is one of the most ancient problems of humankind. We develop here a prediction theory under the assumption that the variable under study can be considered a stationary process. The theory is easy to understand and simple to apply. Moreover, it lends itself to various generalizations, enabling us to deal with nonstationary signals as well.
The organization is as follows. After an introduction to the prediction problem (Section 1.2), we concisely review the notions of random variable, random vector, and random (or stochastic) process in Sections 1.3–1.5, respectively. This leads to the definition of the white process (Section 1.6), a key notion in the subsequent developments. Readers who are already familiar with these probabilistic concepts can skip Sections 1.3–1.5.
Then we introduce the moving average (MA) process and the autoregressive (AR) process (Sections 1.7 and 1.8). By combining them, we come to the family of autoregressive and moving average (ARMA) processes (Section 1.10). This is the family of stationary processes we focus on in this volume.
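As a concrete preview of these process families, the following sketch simulates an ARMA process driven by white noise; the coefficient values and function name here are purely illustrative and not taken from the text. An ARMA(1,1) process obeys a recursion of the form y(t) = a·y(t−1) + e(t) + c·e(t−1), where e(·) is a white process; setting a = 0 gives an MA(1) process, and setting c = 0 gives an AR(1) process.

```python
import numpy as np

def simulate_arma(a, c, n, rng):
    """Simulate y(t) = a*y(t-1) + e(t) + c*e(t-1),
    where e(t) is zero-mean unit-variance Gaussian white noise.
    Stationarity requires |a| < 1 (illustrative sketch)."""
    e = rng.standard_normal(n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = a * y[t - 1] + e[t] + c * e[t - 1]
    return y

rng = np.random.default_rng(0)
y_arma = simulate_arma(a=0.8, c=0.5, n=500, rng=rng)  # ARMA(1,1)
y_ma = simulate_arma(a=0.0, c=0.5, n=500, rng=rng)    # MA(1) special case
y_ar = simulate_arma(a=0.8, c=0.0, n=500, rng=rng)    # AR(1) special case
```

With |a| < 1 the AR part is stable, so after an initial transient the simulated trajectory behaves like a realization of a stationary process.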
For such processes, we develop a prediction theory in Chapter 3, thanks to which the optimal forecast is easily worked out once the model is given.
In our presentation, we make use of elementary concepts of linear dynamical systems, such as transfer functions, poles, and zeros; readers who are not familiar with such topics are cordially ...