Python Deep Learning - Third Edition

Book description

Master neural networks, including convolutional networks and transformers, to tackle computer vision and NLP tasks using Python

Key Features

  • Understand the theory, mathematical foundations, and structure of deep neural networks
  • Become familiar with transformers, large language models, and convolutional networks
  • Learn how to apply them to various computer vision and natural language processing problems

Purchase of the print or Kindle book includes a free PDF eBook

Book Description

The field of deep learning has developed rapidly in recent years and today covers a broad range of applications. This makes it challenging to navigate and hard to understand without solid foundations. This book will guide you from the basics of neural networks to the state-of-the-art large language models in use today.

The first part of the book introduces the main machine learning concepts and paradigms. It covers the mathematical foundations, the structure, and the training algorithms of neural networks and dives into the essence of deep learning.

The second part of the book introduces convolutional networks for computer vision. We’ll learn how to solve image classification, object detection, instance segmentation, and image generation tasks.

The third part focuses on the attention mechanism and transformers – the core network architecture of large language models. We’ll discuss the advanced applications they enable, such as chatbots and text-to-image generation.

By the end of this book, you’ll have a thorough understanding of the inner workings of deep neural networks. You’ll be able to develop new models and adapt existing ones to solve your own tasks, and you’ll know enough to continue your research and stay up to date with the latest advancements in the field.

What you will learn

  • Establish theoretical foundations of deep neural networks
  • Understand convolutional networks and apply them in computer vision applications
  • Become well versed in natural language processing and recurrent networks
  • Explore the attention mechanism and transformers
  • Apply transformers and large language models to natural language and computer vision tasks
  • Implement coding examples with PyTorch, Keras, and Hugging Face Transformers
  • Use MLOps to develop and deploy neural network models

Who this book is for

This book is for software developers/engineers, students, data scientists, data analysts, machine learning engineers, statisticians, and anyone interested in deep learning. Prior experience with Python programming is a prerequisite.

Table of contents

  1. Python Deep Learning
  2. Contributors
  3. About the author
  4. About the reviewer
  5. Preface
    1. Who this book is for
    2. What this book covers
    3. To get the most out of this book
    4. Download the example code files
    5. Conventions used
    6. Get in touch
    7. Share Your Thoughts
    8. Download a free PDF copy of this book
  6. Part 1: Introduction to Neural Networks
  7. Chapter 1: Machine Learning – an Introduction
    1. Technical requirements
    2. Introduction to ML
    3. Different ML approaches
      1. Supervised learning
      2. Unsupervised learning
      3. Reinforcement learning
      4. Components of an ML solution
      5. Neural networks
      6. Introducing PyTorch
    4. Summary
  8. Chapter 2: Neural Networks
    1. Technical requirements
    2. The need for NNs
    3. The math of NNs
      1. Linear algebra
      2. An introduction to probability
      3. Differential calculus
    4. An introduction to NNs
      1. Units – the smallest NN building block
      2. Layers as operations
      3. Multi-layer NNs
      4. Activation functions
      5. The universal approximation theorem
    5. Training NNs
      1. Gradient descent (GD)
      2. Backpropagation
      3. A code example of an NN for the XOR function
    6. Summary
  9. Chapter 3: Deep Learning Fundamentals
    1. Technical requirements
    2. Introduction to DL
    3. Fundamental DL concepts
      1. Feature learning
      2. The reasons for DL’s popularity
    4. Deep neural networks
    5. Training deep neural networks
      1. Improved activation functions
      2. DNN regularization
    6. Applications of DL
    7. Introducing popular DL libraries
      1. Classifying digits with Keras
      2. Classifying digits with PyTorch
    8. Summary
  10. Part 2: Deep Neural Networks for Computer Vision
  11. Chapter 4: Computer Vision with Convolutional Networks
    1. Technical requirements
    2. Intuition and justification for CNNs
    3. Convolutional layers
      1. A coding example of the convolution operation
      2. Cross-channel and depthwise convolutions
      3. Stride and padding in convolutional layers
    4. Pooling layers
    5. The structure of a convolutional network
    6. Classifying images with PyTorch and Keras
      1. Convolutional layers in deep learning libraries
      2. Data augmentation
      3. Classifying images with PyTorch
      4. Classifying images with Keras
    7. Advanced types of convolutions
      1. 1D, 2D, and 3D convolutions
      2. 1×1 convolutions
      3. Depthwise separable convolutions
      4. Dilated convolutions
      5. Transposed convolutions
    8. Advanced CNN models
      1. Introducing residual networks
      2. Inception networks
      3. Introducing Xception
      4. Squeeze-and-Excitation Networks
      5. Introducing MobileNet
      6. EfficientNet
      7. Using pre-trained models with PyTorch and Keras
    9. Summary
  12. Chapter 5: Advanced Computer Vision Applications
    1. Technical requirements
    2. Transfer learning (TL)
      1. Transfer learning with PyTorch
      2. Transfer learning with Keras
    3. Object detection
      1. Approaches to object detection
      2. Object detection with YOLO
      3. Object detection with Faster R-CNN
    4. Introducing image segmentation
      1. Semantic segmentation with U-Net
      2. Instance segmentation with Mask R-CNN
    5. Image generation with diffusion models
      1. Introducing generative models
      2. Denoising Diffusion Probabilistic Models
    6. Summary
  13. Part 3: Natural Language Processing and Transformers
  14. Chapter 6: Natural Language Processing and Recurrent Neural Networks
    1. Technical requirements
    2. Natural language processing
      1. Tokenization
      2. Introducing word embeddings
      3. Word2Vec
      4. Visualizing embedding vectors
      5. Language modeling
    3. Introducing RNNs
      1. RNN implementation and training
      2. Backpropagation through time
      3. Vanishing and exploding gradients
      4. Long short-term memory
      5. Gated recurrent units
    4. Implementing text classification
    5. Summary
  15. Chapter 7: The Attention Mechanism and Transformers
    1. Technical requirements
    2. Introducing seq2seq models
    3. Understanding the attention mechanism
      1. Bahdanau attention
      2. Luong attention
      3. General attention
      4. Transformer attention
      5. Implementing TA
    4. Building transformers with attention
      1. Transformer encoder
      2. Transformer decoder
      3. Putting it all together
      4. Decoder-only and encoder-only models
      5. Bidirectional Encoder Representations from Transformers
      6. Generative Pre-trained Transformer
    5. Summary
  16. Chapter 8: Exploring Large Language Models in Depth
    1. Technical requirements
    2. Introducing LLMs
    3. LLM architecture
      1. LLM attention variants
      2. Prefix decoder
      3. Transformer nuts and bolts
      4. Models
    4. Training LLMs
      1. Training datasets
      2. Pre-training properties
      3. FT with RLHF
    5. Emergent abilities of LLMs
    6. Introducing Hugging Face Transformers
    7. Summary
  17. Chapter 9: Advanced Applications of Large Language Models
    1. Technical requirements
    2. Classifying images with Vision Transformer
      1. Using ViT with Hugging Face Transformers
    3. Understanding the DEtection TRansformer
      1. Using DETR with Hugging Face Transformers
    4. Generating images with Stable Diffusion
      1. Autoencoder
      2. Conditioning transformer
      3. Diffusion model
      4. Using Stable Diffusion with Hugging Face Transformers
    5. Exploring fine-tuning transformers
    6. Harnessing the power of LLMs with LangChain
      1. Using LangChain in practice
    7. Summary
  18. Part 4: Developing and Deploying Deep Neural Networks
  19. Chapter 10: Machine Learning Operations (MLOps)
    1. Technical requirements
    2. Understanding model development
      1. Choosing an NN framework
      2. PyTorch versus TensorFlow versus JAX
      3. Open Neural Network Exchange
      4. Introducing TensorBoard
      5. Developing NN models for edge devices with TF Lite
      6. Mixed-precision training with PyTorch
    3. Exploring model deployment
      1. Deploying NN models with Flask
      2. Building ML web apps with Gradio
    4. Summary
  20. Index
    1. Why subscribe?
  21. Other Books You May Enjoy
    1. Packt is searching for authors like you
    2. Share Your Thoughts
    3. Download a free PDF copy of this book

Product information

  • Title: Python Deep Learning - Third Edition
  • Author(s): Ivan Vasilev
  • Release date: November 2023
  • Publisher(s): Packt Publishing
  • ISBN: 9781837638505