Advanced Deep Learning with Python

Book description

Gain expertise in advanced deep learning domains such as neural networks, meta-learning, graph neural networks, and memory-augmented neural networks using the Python ecosystem

Key Features

  • Get to grips with building faster and more robust deep learning architectures
  • Investigate and train convolutional neural network (CNN) models with GPU-accelerated libraries such as TensorFlow and PyTorch
  • Apply deep neural networks (DNNs) to computer vision and NLP problems, and build generative adversarial networks (GANs)

To build robust deep learning systems, you'll need to understand everything from how neural networks work to how CNN models are trained. In this book, you'll discover newly developed deep learning models, the methodologies used in the domain, and their implementations across different areas of application.

You'll start by understanding the building blocks and the math behind neural networks, and then move on to CNNs and their advanced applications in computer vision. You'll also learn to apply the most popular CNN architectures to object detection and image segmentation. Further on, you'll focus on variational autoencoders and GANs. You'll then use neural networks to extract sophisticated vector representations of words, before going on to cover various types of recurrent networks, such as LSTM and GRU. You'll even explore the attention mechanism for processing sequential data without the help of recurrent neural networks (RNNs). Later, you'll use graph neural networks to process structured data, and cover meta-learning, which allows you to train neural networks with fewer training samples. Finally, you'll learn how to apply deep learning to autonomous vehicles.

By the end of this book, you'll have mastered key deep learning concepts and the different applications of deep learning models in the real world.

What you will learn

  • Cover advanced and state-of-the-art neural network architectures
  • Understand the theory and math behind neural networks
  • Train DNNs and apply them to modern deep learning problems
  • Use CNNs for object detection and image segmentation
  • Implement generative adversarial networks (GANs) and variational autoencoders to generate new images
  • Solve natural language processing (NLP) tasks, such as machine translation, using sequence-to-sequence models
  • Understand deep learning techniques, such as meta-learning and graph neural networks

Who this book is for

This book is for data scientists, deep learning engineers and researchers, and AI developers who want to further their knowledge of deep learning and build innovative and unique deep learning projects. Anyone looking to get to grips with advanced use cases and methodologies adopted in the deep learning domain using real-world examples will also find this book useful. A basic understanding of deep learning concepts and a working knowledge of the Python programming language are assumed.

Table of contents

  1. Title Page
  2. Copyright and Credits
    1. Advanced Deep Learning with Python
  3. About Packt
    1. Why subscribe?
  4. Contributors
    1. About the author
    2. About the reviewer
    3. Packt is searching for authors like you
  5. Preface
    1. Who this book is for
    2. What this book covers
    3. To get the most out of this book
      1. Download the example code files
      2. Download the color images
      3. Conventions used
    4. Get in touch
      1. Reviews
  6. Section 1: Core Concepts
  7. The Nuts and Bolts of Neural Networks
    1. The mathematical apparatus of NNs
      1. Linear algebra
        1. Vector and matrix operations
      2. Introduction to probability
        1. Probability and sets
        2. Conditional probability and the Bayes rule
        3. Random variables and probability distributions
          1. Probability distributions
        4. Information theory
      3. Differential calculus
    2. A short introduction to NNs
      1. Neurons
      2. Layers as operations
      3. NNs
      4. Activation functions
      5. The universal approximation theorem
    3. Training NNs
      1. Gradient descent
      2. Cost functions
      3. Backpropagation
      4. Weight initialization
      5. SGD improvements
    4. Summary
  8. Section 2: Computer Vision
  9. Understanding Convolutional Networks
    1. Understanding CNNs
      1. Types of convolutions
        1. Transposed convolutions
        2. 1×1 convolutions
        3. Depth-wise separable convolutions
        4. Dilated convolutions
      2. Improving the efficiency of CNNs
        1. Convolution as matrix multiplication
        2. Winograd convolutions
      3. Visualizing CNNs
        1. Guided backpropagation
        2. Gradient-weighted class activation mapping
      4. CNN regularization
    2. Introducing transfer learning
      1. Implementing transfer learning with PyTorch
      2. Transfer learning with TensorFlow 2.0
    3. Summary
  10. Advanced Convolutional Networks
    1. Introducing AlexNet
    2. An introduction to Visual Geometry Group
      1. VGG with PyTorch and TensorFlow
    3. Understanding residual networks
      1. Implementing residual blocks
    4. Understanding Inception networks
      1. Inception v1
      2. Inception v2 and v3
      3. Inception v4 and Inception-ResNet
    5. Introducing Xception
    6. Introducing MobileNet
    7. An introduction to DenseNets
    8. The workings of neural architecture search
    9. Introducing capsule networks
      1. The limitations of convolutional networks
      2. Capsules
        1. Dynamic routing
      3. The structure of the capsule network
    10. Summary
  11. Object Detection and Image Segmentation
    1. Introduction to object detection
      1. Approaches to object detection
      2. Object detection with YOLOv3
        1. A code example of YOLOv3 with OpenCV
      3. Object detection with Faster R-CNN
        1. Region proposal network
        2. Detection network
        3. Implementing Faster R-CNN with PyTorch
    2. Introducing image segmentation
      1. Semantic segmentation with U-Net
      2. Instance segmentation with Mask R-CNN
        1. Implementing Mask R-CNN with PyTorch
    3. Summary
  12. Generative Models
    1. Intuition and justification of generative models
    2. Introduction to VAEs
      1. Generating new MNIST digits with VAE
    3. Introduction to GANs
      1. Training GANs
        1. Training the discriminator
        2. Training the generator
      2. Putting it all together
      3. Problems with training GANs
    4. Types of GAN
      1. Deep Convolutional GAN
        1. Implementing DCGAN
      2. Conditional GAN
        1. Implementing CGAN
      3. Wasserstein GAN
        1. Implementing WGAN
      4. Image-to-image translation with CycleGAN
        1. Implementing CycleGAN
          1. Building the generator and discriminator
          2. Putting it all together
    5. Introducing artistic style transfer
    6. Summary
  13. Section 3: Natural Language and Sequence Processing
  14. Language Modeling
    1. Understanding n-grams
    2. Introducing neural language models
      1. Neural probabilistic language model
      2. Word2Vec
        1. CBOW
        2. Skip-gram
        3. fastText
      3. Global Vectors for Word Representation model
    3. Implementing language models
      1. Training the embedding model
      2. Visualizing embedding vectors
    4. Summary
  15. Understanding Recurrent Networks
    1. Introduction to RNNs
      1. RNN implementation and training
        1. Backpropagation through time
        2. Vanishing and exploding gradients
    2. Introducing long short-term memory
      1. Implementing LSTM
    3. Introducing gated recurrent units
      1. Implementing GRUs
    4. Implementing text classification
    5. Summary
  16. Sequence-to-Sequence Models and Attention
    1. Introducing seq2seq models
    2. Seq2seq with attention
      1. Bahdanau attention
      2. Luong attention
      3. General attention
      4. Implementing seq2seq with attention
        1. Implementing the encoder
        2. Implementing the decoder
        3. Implementing the decoder with attention
        4. Training and evaluation
    3. Understanding transformers
      1. The transformer attention
      2. The transformer model
      3. Implementing transformers
        1. Multihead attention
        2. Encoder
        3. Decoder
        4. Putting it all together
    4. Transformer language models
      1. Bidirectional encoder representations from transformers
        1. Input data representation
        2. Pretraining
        3. Fine-tuning
      2. Transformer-XL
        1. Segment-level recurrence with state reuse
        2. Relative positional encodings
      3. XLNet
      4. Generating text with a transformer language model
    5. Summary
  17. Section 4: A Look to the Future
  18. Emerging Neural Network Designs
    1. Introducing Graph NNs
      1. Recurrent GNNs
      2. Convolutional Graph Networks
        1. Spectral-based convolutions
        2. Spatial-based convolutions with attention
      3. Graph autoencoders
      4. Neural graph learning
        1. Implementing graph regularization
    2. Introducing memory-augmented NNs
      1. Neural Turing machines
      2. MANN*
    3. Summary
  19. Meta Learning
    1. Introduction to meta learning
      1. Zero-shot learning
      2. One-shot learning
      3. Meta-training and meta-testing
    2. Metric-based meta learning
      1. Matching networks for one-shot learning
      2. Siamese networks
        1. Implementing Siamese networks
      3. Prototypical networks
    3. Optimization-based learning
    4. Summary
  20. Deep Learning for Autonomous Vehicles
    1. Introduction to AVs
      1. Brief history of AV research
      2. Levels of automation
    2. Components of an AV system
      1. Environment perception
        1. Sensing
        2. Localization
        3. Moving object detection and tracking
      2. Path planning
    3. Introduction to 3D data processing
    4. Imitation driving policy
      1. Behavioral cloning with PyTorch
        1. Generating the training dataset
        2. Implementing the agent neural network
        3. Training
        4. Letting the agent drive
        5. Putting it all together
    5. Driving policy with ChauffeurNet
      1. Input and output representations
      2. Model architecture
      3. Training
    6. Summary
  21. Other Books You May Enjoy
    1. Leave a review - let other readers know what you think

Product information

  • Title: Advanced Deep Learning with Python
  • Author(s): Ivan Vasilev
  • Release date: December 2019
  • Publisher(s): Packt Publishing
  • ISBN: 9781789956177