Deep Learning in Time Series Analysis

Book description

The concept of deep machine learning is easier to understand by paying attention to cyclic stochastic time series: time series whose content is non-stationary not only within each cycle, but also across cycles, in the form of cycle-to-cycle variations.
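For readers unfamiliar with the notion of cycle-to-cycle variation, the short Python sketch below generates a synthetic cyclic stochastic series whose amplitude and frequency drift from one cycle to the next; it is an illustrative assumption, not an example taken from the book.

```python
# Minimal sketch (not from the book): a synthetic cyclic stochastic time series
# whose content varies both within each cycle and from cycle to cycle.
# All names and parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def cyclic_series(n_cycles=10, cycle_len=200):
    """Concatenate noisy cycles whose amplitude and frequency drift cycle to cycle."""
    t = np.linspace(0.0, 1.0, cycle_len, endpoint=False)
    cycles = []
    for _ in range(n_cycles):
        amp = 1.0 + 0.3 * rng.standard_normal()       # cycle-to-cycle amplitude variation
        freq = 5.0 + rng.uniform(-0.5, 0.5)            # cycle-to-cycle frequency variation
        within = 0.2 * rng.standard_normal(cycle_len)  # stochastic content within the cycle
        cycles.append(amp * np.sin(2 * np.pi * freq * t) + within)
    return np.concatenate(cycles)

x = cyclic_series()
print(x.shape)  # (2000,)
```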

Table of contents

  1. Cover Page
  2. Title Page
  3. Copyright Page
  4. Dedication
  5. Foreword
  6. Preface
    1. Book Focus
    2. Book Readership
    3. Contributions
  7. Contents
  8. Contributors
  9. Part I Fundamentals of Learning
    1. 1 Introduction to Learning
      1. 1.1 Artificial Intelligence
      2. 1.2 Data and Signal Definition
      3. 1.3 Data Versus Signal
      4. 1.4 Signal Models
      5. 1.5 Noise and Interference
      6. 1.6 Time Series Definition
      7. 1.7 Time Series Analysis
      8. 1.8 Deep Learning and Time Series Analysis
      9. 1.9 Organisation of the Book
    2. 2 Learning Theory
      1. 2.1 Learning and Adaptation
      2. 2.2 Learning in a Practical Example
      3. 2.3 Mathematical View to Learning
        1. 2.3.1 Training and Validation Data
        2. 2.3.2 Training Method
        3. 2.3.3 Training Parameters
        4. 2.3.4 Hyperparameters
      4. 2.4 Learning Phases
      5. 2.5 Training, Validation, and Test
      6. 2.6 Learning Schemes
        1. 2.6.1 Supervised-Static Learning
        2. 2.6.2 Supervised-Dynamic Learning
        3. 2.6.3 Unsupervised-Static Learning
        4. 2.6.4 Unsupervised-Dynamic Learning
      7. 2.7 Training Criteria
      8. 2.8 Optimization, Training, and Learning
      9. 2.9 Evaluation of Learning Performance
        1. 2.9.1 Structural Risk
        2. 2.9.2 Empirical Risk
        3. 2.9.3 Overfitting and Underfitting Risk
        4. 2.9.4 Learning Capacity
      10. 2.10 Validation
        1. 2.10.1 Repeated Random Sub-Sampling (RRSS)
        2. 2.10.2 K-Fold Validation
        3. 2.10.3 A-Test Validation
      11. 2.11 Privileges of A-Test Method
        1. 2.11.1 A-Test and Structural Risk
        2. 2.11.2 A-Test and Learning Capacity
        3. 2.11.3 A-Test vs other Methods
      12. 2.12 Large and Small Training Data
    3. 3 Pre-processing and Visualisation
      1. 3.1 Dimension Reduction
        1. 3.1.1 Feature Selection
        2. 3.1.2 Linear Transformation
      2. 3.2 Supervised Mapping
        1. 3.2.1 K-Nearest Neighbours (KNN)
        2. 3.2.2 Perceptron Neural Network
        3. 3.2.3 Multi-layer Perceptron Neural Networks (MLP)
      3. 3.3 Unsupervised Mapping
        1. 3.3.1 K-Means Clustering
        2. 3.3.2 Self-Organizing Map (SOM)
        3. 3.3.3 Hierarchical Clustering
  10. Part II Essentials of Time Series Analysis
    1. 4 Basics of Time Series
      1. 4.1 Introduction to Time Series Analysis
      2. 4.2 Deterministic, Chaotic and Stochastic
      3. 4.3 Stochastic Behaviors of Time Series
        1. 4.3.1 Cyclic Time Series
        2. 4.3.2 Partially Cyclic Time Series
      4. 4.4 Time Series Prediction
      5. 4.5 Time Series Classification
    2. 5 Multi-Layer Perceptron (MLP) Neural Networks for Time Series Classification
      1. 5.1 Time-Delayed Neural Network (TDNN)
      2. 5.2 Time-Growing Neural Network (TGNN)
      3. 5.3 Forward, Backward and Bilateral Time-Growing Window
      4. 5.4 Privileges of Time-Growing Neural Network
        1. 5.4.1 TGNN includes MLP in its architecture
        2. 5.4.2 TGNN can include TDNN in its structure
        3. 5.4.3 TGNN is optimal in learning the first window
    3. 6 Dynamic Models for Sequential Data Analysis
      1. 6.1 Dynamic Time Warping (Structural Classification)
      2. 6.2 Hidden Markov Model (Statistical Classification)
        1. 6.2.1 Model-based analysis
        2. 6.2.2 Essentials of Hidden Markov Model (HMM)
        3. 6.2.3 Problem statement and implementation
        4. 6.2.4 Time series analysis and HMM
      3. 6.3 Recurrent Neural Network
  11. Part III Deep Learning Approaches to Time Series Classification
    1. 7 Clustering for Learning at Deep Level
      1. 7.1 Clustering as a Tool for Deep Learning
      2. 7.2 Modified K-Means Method
      3. 7.3 Modified Fuzzy C-Means
      4. 7.4 Discriminant Analysis
      5. 7.5 Cluster-Based vs Discriminant Analysis Methods
      6. 7.6 Combined Methods
    2. 8 Deep Time Growing Neural Network
      1. 8.1 Basic Architecture
      2. 8.2 Learning at the Deep Level
        1. 8.2.1 Learning the growing centre
        2. 8.2.2 Learning the deep elements
      3. 8.3 Surface Learning
    3. 9 Deep Learning of Cyclic Time Series
      1. 9.1 Time Growing Neural Network
      2. 9.2 Growing-Time Support Vector Machine
      3. 9.3 Distance-Based Learning
      4. 9.4 Optimization
    4. 10 Hybrid Method for Cyclic Time Series
      1. 10.1 Learning Deep Contents
      2. 10.2 Cyclic Learning
      3. 10.3 Classification
    5. 11 Recurrent Neural Networks (RNN)
      1. 11.1 Introduction
      2. 11.2 Structure of Recurrent Neural Networks
      3. 11.3 Unfolding the Network in Time
      4. 11.4 Backpropagation Through Time
      5. 11.5 The Challenge of Long-term Dependencies
      6. 11.6 Long Short-Term Memory (LSTM)
      7. 11.7 Other Recurrent Networks
        1. 11.7.1 Unfolding outputs at all steps
        2. 11.7.2 Gated recurrent networks
        3. 11.7.3 Echo state networks
    6. 12 Convolutional Neural Networks (CNN)
      1. 12.1 Introduction
      2. 12.2 Architecture Overview
      3. 12.3 Convolutional Layer
      4. 12.4 Pooling Layer
      5. 12.5 Learning of CNN
      6. 12.6 Recurrent CNN
  12. Bibliography
  13. Index

Product information

  • Title: Deep Learning in Time Series Analysis
  • Author(s): Arash Gharehbaghi
  • Release date: July 2023
  • Publisher(s): CRC Press
  • ISBN: 9781000911435