Time Series, 2nd Edition

Book description

This is the second edition of a popular graduate-level textbook on time series modeling, computation, and inference. The book is distinctive in its approach, with a focus on Bayesian methods, although classical methods are also covered.

Table of contents

  1. Cover
  2. Half Title
  3. Series Page
  4. Title Page
  5. Copyright Page
  6. Contents
  7. Preface
  8. Authors
  9. 1. Notation, definitions, and basic inference
    1. 1.1. Problem Areas and Objectives
    2. 1.2. Stochastic Processes and Stationarity
    3. 1.3. Autocorrelation and Cross-correlation
    4. 1.4. Smoothing and Differencing
    5. 1.5. A Primer on Likelihood and Bayesian Inference
      1. 1.5.1. ML, MAP, and LS Estimation
      2. 1.5.2. Traditional Least Squares
      3. 1.5.3. Full Bayesian Analysis
        1. 1.5.3.1. Reference Bayesian Analysis
        2. 1.5.3.2. Conjugate Bayesian Analysis
      4. 1.5.4. Nonconjugate Bayesian Analysis
      5. 1.5.5. Posterior Sampling
        1. 1.5.5.1. The Metropolis-Hastings Algorithm
        2. 1.5.5.2. Gibbs Sampling
        3. 1.5.5.3. Convergence
    6. 1.6. Appendix
      1. 1.6.1. The Uniform Distribution
      2. 1.6.2. The Univariate Normal Distribution
      3. 1.6.3. The Multivariate Normal Distribution
      4. 1.6.4. The Gamma and Inverse-gamma Distributions
      5. 1.6.5. The Exponential Distribution
      6. 1.6.6. The Chi-square Distribution
      7. 1.6.7. The Inverse Chi-square Distributions
      8. 1.6.8. The Univariate Student-t Distribution
      9. 1.6.9. The Multivariate Student-t Distribution
    7. 1.7. Problems
  10. 2. Traditional time domain models
    1. 2.1. Structure of Autoregressions
      1. 2.1.1. Stationarity in AR Processes
      2. 2.1.2. State-Space Representation of an AR(p)
      3. 2.1.3. Characterization of AR(2) Processes
      4. 2.1.4. Autocorrelation Structure of an AR(p)
      5. 2.1.5. The Partial Autocorrelation Function
    2. 2.2. Forecasting
    3. 2.3. Estimation in AR Models
      1. 2.3.1. Yule-Walker and Maximum Likelihood
      2. 2.3.2. Basic Bayesian Inference for AR Models
      3. 2.3.3. Simulation of Posterior Distributions
      4. 2.3.4. Order Assessment
      5. 2.3.5. Initial Values and Missing Data
      6. 2.3.6. Imputing Initial Values via Simulation
    4. 2.4. Further Issues in Bayesian Inference for AR Models
      1. 2.4.1. Sensitivity to the Choice of Prior Distributions
        1. 2.4.1.1. Analysis Based on Normal Priors
        2. 2.4.1.2. Discrete Normal Mixture Prior and Subset Models
      2. 2.4.2. Alternative Prior Distributions
        1. 2.4.2.1. Scale-mixtures and Smoothness Priors
        2. 2.4.2.2. Priors Based on AR Latent Structure
    5. 2.5. Autoregressive Moving Average Models (ARMA)
      1. 2.5.1. Structure of ARMA Models
      2. 2.5.2. Autocorrelation and Partial Autocorrelation Functions
      3. 2.5.3. Inversion of AR Components
      4. 2.5.4. Forecasting and Estimation of ARMA Processes
        1. 2.5.4.1. Forecasting ARMA Models
        2. 2.5.4.2. MLE and Least Squares Estimation
        3. 2.5.4.3. State-space Representation
        4. 2.5.4.4. Bayesian Estimation of ARMA Processes
    6. 2.6. Other Models
    7. 2.7. Appendix
      1. 2.7.1. The Reversible Jump MCMC Algorithm
      2. 2.7.2. The Binomial Distribution
      3. 2.7.3. The Beta Distribution
      4. 2.7.4. The Dirichlet Distribution
      5. 2.7.5. The Beta-binomial Distribution
    8. 2.8. Problems
  11. 3. The frequency domain
    1. 3.1. Harmonic Regression
      1. 3.1.1. The One-component Model
        1. 3.1.1.1. Reference Analysis
      2. 3.1.2. The Periodogram
      3. 3.1.3. Some Data Analyses
      4. 3.1.4. Several Uncertain Frequency Components
      5. 3.1.5. Harmonic Component Models of Known Period
      6. 3.1.6. The Periodogram (revisited)
    2. 3.2. Some Spectral Theory
      1. 3.2.1. Spectral Representation of a Time Series Process
      2. 3.2.2. Representation of Autocorrelation Functions
      3. 3.2.3. Other Facts and Examples
      4. 3.2.4. Traditional Nonparametric Spectral Analysis
    3. 3.3. Discussion and Extensions
      1. 3.3.1. Long Memory Time Series Models
    4. 3.4. Appendix
      1. 3.4.1. The F Distribution
      2. 3.4.2. Distributions of Quadratic Forms
      3. 3.4.3. Orthogonality of Harmonics
      4. 3.4.4. Complex Valued Random Variables
      5. 3.4.5. Orthogonal Increments Processes
        1. 3.4.5.1. Real-valued Orthogonal Increments Processes
        2. 3.4.5.2. Complex-valued Orthogonal Increments Processes
    5. 3.5. Problems
  12. 4. Dynamic linear models
    1. 4.1. General Linear Model Structures
    2. 4.2. Forecast Functions and Model Forms
      1. 4.2.1. Superposition of Models
      2. 4.2.2. Time Series Models
    3. 4.3. Inference in DLMs: Basic Normal Theory
      1. 4.3.1. Sequential Updating: Filtering
      2. 4.3.2. Learning a Constant Observation Variance
      3. 4.3.3. Missing and Unequally Spaced Data
      4. 4.3.4. Forecasting
      5. 4.3.5. Retrospective Updating: Smoothing
      6. 4.3.6. Discounting for DLM State Evolution Variances
      7. 4.3.7. Stochastic Variances and Discount Learning
        1. 4.3.7.1. References and additional comments
      8. 4.3.8. Intervention, Monitoring, and Model Performance
        1. 4.3.8.1. Intervention
        2. 4.3.8.2. Model monitoring and performance
    4. 4.4. Extensions: Non-Gaussian and Nonlinear Models
    5. 4.5. Posterior Simulation: MCMC Algorithms
      1. 4.5.1. Examples
    6. 4.6. Problems
  13. 5. State-space TVAR models
    1. 5.1. Time-Varying Autoregressions and Decompositions
      1. 5.1.1. Basic DLM Decomposition
      2. 5.1.2. Latent Structure in TVAR Models
        1. 5.1.2.1. Decompositions for standard autoregressions
        2. 5.1.2.2. Decompositions in the TVAR case
      3. 5.1.3. Interpreting Latent TVAR Structure
    2. 5.2. TVAR Model Specification and Posterior Inference
    3. 5.3. Extensions
    4. 5.4. Problems
  14. 6. SMC methods for state-space models
    1. 6.1. General State-Space Models
    2. 6.2. Posterior Simulation: Sequential Monte Carlo
      1. 6.2.1. Sequential Importance Sampling and Resampling
      2. 6.2.2. The Auxiliary Particle Filter
      3. 6.2.3. SMC for Combined State and Parameter Estimation
        1. 6.2.3.1. Algorithm of Liu and West
        2. 6.2.3.2. Storvik's algorithm
        3. 6.2.3.3. Practical filtering
        4. 6.2.3.4. Particle learning methods
      4. 6.2.4. Smoothing
      5. 6.2.5. Examples
    3. 6.3. Problems
  15. 7. Mixture models in time series
    1. 7.1. Markov Switching Models
      1. 7.1.1. Parameter Estimation
      2. 7.1.2. Other Models
    2. 7.2. Multiprocess Models
      1. 7.2.1. Definitions and Examples
      2. 7.2.2. Posterior Inference
        1. 7.2.2.1. Posterior inference in class I models
        2. 7.2.2.2. Posterior inference in class II models
    3. 7.3. Mixtures of General State-Space Models
    4. 7.4. Case Study: Detecting Fatigue from EEGs
      1. 7.4.1. Structured Priors in Multi-AR Models
      2. 7.4.2. Posterior Inference
    5. 7.5. Univariate Stochastic Volatility Models
      1. 7.5.1. Zero-Mean AR(1) SV Model
      2. 7.5.2. Normal Mixture Approximation
      3. 7.5.3. Centered Parameterization
      4. 7.5.4. MCMC Analysis
      5. 7.5.5. Further Comments
    6. 7.6. Problems
  16. 8. Topics and examples in multiple time series
    1. 8.1. Multichannel Modeling of EEG Data
      1. 8.1.1. Multiple Univariate TVAR Models
      2. 8.1.2. A Simple Factor Model
    2. 8.2. Some Spectral Theory
      1. 8.2.1. The Cross-Spectrum and Cross-Periodogram
    3. 8.3. Dynamic Lag/Lead Models
    4. 8.4. Other Approaches
    5. 8.5. Problems
  17. 9. Vector AR and ARMA models
    1. 9.1. Vector Autoregressive Models
      1. 9.1.1. State-Space Representation of a VAR Process
      2. 9.1.2. The Moving Average Representation of a VAR Process
      3. 9.1.3. VAR Time Series Decompositions
    2. 9.2. Vector ARMA Models
      1. 9.2.1. Autocovariances and Cross-covariances
      2. 9.2.2. Partial Autoregression Matrix Function
      3. 9.2.3. VAR(1) and DLM Representations
    3. 9.3. Estimation in VARMA
      1. 9.3.1. Identifiability
      2. 9.3.2. Least Squares Estimation
      3. 9.3.3. Maximum Likelihood Estimation
        1. 9.3.3.1. Conditional likelihood
        2. 9.3.3.2. Exact likelihood
    4. 9.4. Bayesian VAR, TV-VAR, and DDNMs
    5. 9.5. Mixtures of VAR Processes
    6. 9.6. PARCOR Representations and Spectral Analysis
      1. 9.6.1. Spectral Matrix of VAR and VARMA Processes
    7. 9.7. Problems
  18. 10. General classes of multivariate dynamic models
    1. 10.1. Theory of Multivariate and Matrix Normal DLMs
      1. 10.1.1. Multivariate Normal DLMs
      2. 10.1.2. Matrix Normal DLMs and Exchangeable Time Series
    2. 10.2. Multivariate DLMs and Exchangeable Time Series
      1. 10.2.1. Sequential Updating
      2. 10.2.2. Forecasting and Retrospective Smoothing
    3. 10.3. Learning Cross-Series Covariances
      1. 10.3.1. Sequential Updating
      2. 10.3.2. Forecasting and Retrospective Smoothing
    4. 10.4. Time-Varying Covariance Matrices
      1. 10.4.1. Introductory Discussion
      2. 10.4.2. Wishart Matrix Discounting Models
      3. 10.4.3. Matrix Beta Evolution Model
      4. 10.4.4. DLM Extension and Sequential Updating
      5. 10.4.5. Retrospective Analysis
      6. 10.4.6. Financial Time Series Volatility Example
        1. 10.4.6.1. Data and model
        2. 10.4.6.2. Trajectories of multivariate stochastic volatility
        3. 10.4.6.3. Time-varying principal components analysis
        4. 10.4.6.4. Latent components in multivariate volatility
      7. 10.4.7. Short-term Forecasting for Portfolio Decisions
        1. 10.4.7.1. Additional comments and extensions
      8. 10.4.8. Beta-Bartlett Wishart Models for Stochastic Volatility
        1. 10.4.8.1. Discount model variants
        2. 10.4.8.2. Additional comments and current research areas
    5. 10.5. Multivariate Dynamic Graphical Models
      1. 10.5.1. Gaussian Graphical Models
      2. 10.5.2. Dynamic Graphical Models
    6. 10.6. Selected recent developments
      1. 10.6.1. Simultaneous Graphical Dynamic Models
      2. 10.6.2. Models for Multivariate Time Series of Counts
      3. 10.6.3. Models for Flows on Dynamic Networks
      4. 10.6.4. Dynamic Multiscale Models
    7. 10.7. Appendix
      1. 10.7.1. The Matrix Normal Distribution
      2. 10.7.2. The Wishart Distribution
      3. 10.7.3. The Inverse Wishart Distribution
        1. 10.7.3.1. Point estimates of variance matrices
      4. 10.7.4. The Normal, Inverse Wishart Distribution
      5. 10.7.5. The Matrix Normal, Inverse Wishart Distribution
      6. 10.7.6. Hyper-Inverse Wishart Distributions
        1. 10.7.6.1. Decomposable graphical models
        2. 10.7.6.2. The hyper-inverse Wishart distribution
        3. 10.7.6.3. Prior and posterior HIW distributions
        4. 10.7.6.4. Normal, hyper-inverse Wishart distributions
    8. 10.8. Problems
  19. 11. Latent factor models
    1. 11.1. Introduction
    2. 11.2. Static Factor Models
      1. 11.2.1. 1-Factor Case
      2. 11.2.2. MCMC for Factor Models with One Factor
      3. 11.2.3. Example: A 1-Factor Model for Temperature
      4. 11.2.4. Factor Models with Multiple Factors
      5. 11.2.5. MCMC for the k-Factor Model
      6. 11.2.6. Selection of Number of Factors
      7. 11.2.7. Example: A k-Factor Model for Temperature
    3. 11.3. Multivariate Dynamic Latent Factor Models
      1. 11.3.1. Example: A Dynamic 3-Factor Model for Temperature
    4. 11.4. Factor Stochastic Volatility
      1. 11.4.1. Computations
      2. 11.4.2. Factor Stochastic Volatility Model for Exchange Rates
    5. 11.5. Spatiotemporal Dynamic Factor Models
      1. 11.5.1. Example: Temperature Over the Eastern USA
    6. 11.6. Other Extensions and Recent Developments
    7. 11.7. Problems
  20. Bibliography
  21. Author Index
  22. Subject Index

Product information

  • Title: Time Series, 2nd Edition
  • Authors: Raquel Prado, Marco A. R. Ferreira, Mike West
  • Release date: July 2021
  • Publisher: Chapman and Hall/CRC
  • ISBN: 9781498747042