Hyperparameter Tuning with Python

Book description

Take your machine learning models to the next level by learning how to leverage hyperparameter tuning, allowing you to control the model's finest details

Key Features

  • Gain a deep understanding of how hyperparameter tuning works
  • Explore exhaustive search, heuristic search, and Bayesian and multi-fidelity optimization methods
  • Learn which method to use for a given situation or problem

Book Description

Hyperparameters are an important element in building useful machine learning models. This book curates numerous hyperparameter tuning methods for Python, one of the most popular coding languages for machine learning. Alongside in-depth explanations of how each method works, you will use a decision map that can help you identify the best tuning method for your requirements.

You'll start with an introduction to hyperparameter tuning and understand why it's important. Next, you'll learn the best methods for hyperparameter tuning for a variety of use cases and specific algorithm types. This book covers not only the usual grid or random search but also other powerful underdog methods. Individual chapters are dedicated to the four main groups of hyperparameter tuning methods: exhaustive search, heuristic search, Bayesian optimization, and multi-fidelity optimization. Later, you will learn about top frameworks like Scikit, Hyperopt, Optuna, NNI, and DEAP to implement hyperparameter tuning. Finally, you will cover the hyperparameters of popular algorithms and best practices that will help you tune your hyperparameters efficiently.
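
For a taste of what's ahead, here is a minimal sketch of grid search, one of the exhaustive search methods covered in the book, using scikit-learn's GridSearchCV. The dataset and parameter grid are illustrative assumptions, not an example taken from the book:

```python
# A minimal grid search sketch using scikit-learn (illustrative values only).
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_iris(return_X_y=True)

# Illustrative hyperparameter grid; every combination is evaluated.
param_grid = {
    "n_estimators": [50, 100, 200],
    "max_depth": [None, 5, 10],
}

search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid,
    cv=5,                 # 5-fold cross-validation
    scoring="accuracy",
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```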

By the end of this book, you will have the skills you need to take full control of your machine learning models and get the best possible results from them.

What you will learn

  • Discover hyperparameter space and types of hyperparameter distributions (see the sketch after this list)
  • Explore manual, grid, and random search, and the pros and cons of each
  • Understand powerful underdog methods along with best practices
  • Explore the hyperparameters of popular algorithms
  • Discover how to tune hyperparameters in different frameworks and libraries
  • Deep dive into top frameworks such as Scikit, Hyperopt, Optuna, NNI, and DEAP
  • Get to grips with best practices that you can apply to your machine learning models right away
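
As a hint of how hyperparameter distributions are used in practice, here is a minimal sketch of a distribution-based search space sampled by scikit-learn's RandomizedSearchCV with scipy.stats distributions. The estimator and ranges are illustrative assumptions:

```python
# A minimal random search sketch with a distribution-based hyperparameter
# space (illustrative ranges; loguniform requires SciPy >= 1.4).
from scipy.stats import loguniform
from sklearn.datasets import load_iris
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Continuous hyperparameters that span orders of magnitude are usually
# sampled from a log-uniform distribution rather than a fixed list.
param_distributions = {
    "C": loguniform(1e-3, 1e3),
    "gamma": loguniform(1e-4, 1e1),
}

search = RandomizedSearchCV(
    SVC(),
    param_distributions,
    n_iter=20,            # number of randomly sampled configurations
    cv=5,
    random_state=0,
)
search.fit(X, y)
print(search.best_params_)
```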

Who this book is for

This book is for data scientists and ML engineers who are working with Python and want to further boost their ML model's performance by using the appropriate hyperparameter tuning method. Although a basic understanding of machine learning and how to code in Python is needed, no prior knowledge of hyperparameter tuning in Python is required.

Table of contents

  1. Hyperparameter Tuning with Python
  2. Contributors
  3. About the author
  4. About the reviewer
  5. Preface
    1. Who this book is for
    2. What this book covers
    3. To get the most out of this book
    4. Download the example code files
    5. Download the color images
    6. Conventions used
    7. Get in touch
    8. Share Your Thoughts
  6. Section 1: The Methods
  7. Chapter 1: Evaluating Machine Learning Models
    1. Technical requirements
    2. Understanding the concept of overfitting
    3. Creating training, validation, and test sets
    4. Exploring random and stratified splits
    5. Discovering repeated k-fold cross-validation
    6. Discovering Leave-One-Out cross-validation
    7. Discovering Leave-P-Out (LPO) cross-validation
    8. Discovering time-series cross-validation
    9. Summary
    10. Further reading
  8. Chapter 2: Introducing Hyperparameter Tuning
    1. What is hyperparameter tuning?
    2. Demystifying hyperparameters versus parameters
    3. Understanding hyperparameter space and distributions
    4. Summary
  9. Chapter 3: Exploring Exhaustive Search
    1. Understanding manual search
    2. Understanding grid search
    3. Understanding random search
    4. Summary
  10. Chapter 4: Exploring Bayesian Optimization
    1. Introducing BO
    2. Understanding BO GP
    3. Understanding SMAC
    4. Understanding TPE
    5. Understanding Metis
    6. Summary
  11. Chapter 5: Exploring Heuristic Search
    1. Understanding simulated annealing
    2. Understanding genetic algorithms
    3. Understanding particle swarm optimization
    4. Understanding Population-Based Training
    5. Summary
  12. Chapter 6: Exploring Multi-Fidelity Optimization
    1. Introducing MFO
    2. Understanding coarse-to-fine search
    3. Understanding successive halving
    4. Understanding Hyperband
    5. Understanding BOHB
    6. Summary
  13. Section 2: The Implementation
  14. Chapter 7: Hyperparameter Tuning via Scikit
    1. Technical requirements
    2. Introducing Scikit
    3. Implementing Grid Search
    4. Implementing Random Search
    5. Implementing Coarse-to-Fine Search
    6. Implementing Successive Halving
    7. Implementing Hyperband
    8. Implementing Bayesian Optimization Gaussian Process
    9. Implementing Bayesian Optimization Random Forest
    10. Implementing Bayesian Optimization Gradient Boosted Trees
    11. Summary
  15. Chapter 8: Hyperparameter Tuning via Hyperopt
    1. Technical requirements
    2. Introducing Hyperopt
    3. Implementing Random Search
    4. Implementing Tree-structured Parzen Estimators
    5. Implementing Adaptive TPE
    6. Implementing simulated annealing
    7. Summary
  16. Chapter 9: Hyperparameter Tuning via Optuna
    1. Technical requirements
    2. Introducing Optuna
    3. Implementing TPE
    4. Implementing Random Search
    5. Implementing Grid Search
    6. Implementing Simulated Annealing
    7. Implementing Successive Halving
    8. Implementing Hyperband
    9. Summary
  17. Chapter 10: Advanced Hyperparameter Tuning with DEAP and Microsoft NNI
    1. Technical requirements
    2. Introducing DEAP
    3. Implementing the Genetic Algorithm
    4. Implementing Particle Swarm Optimization
    5. Introducing Microsoft NNI
    6. Implementing Grid Search
    7. Implementing Random Search
    8. Implementing Tree-structured Parzen Estimators
    9. Implementing Sequential Model Algorithm Configuration
    10. Implementing Bayesian Optimization Gaussian Process
    11. Implementing Metis
    12. Implementing Simulated Annealing
    13. Implementing Hyperband
    14. Implementing Bayesian Optimization Hyperband
    15. Implementing Population-Based Training
    16. Summary
  18. Section 3: Putting Things into Practice
  19. Chapter 11: Understanding the Hyperparameters of Popular Algorithms
    1. Exploring Random Forest hyperparameters
    2. Exploring XGBoost hyperparameters
    3. Exploring LightGBM hyperparameters
    4. Exploring CatBoost hyperparameters
    5. Exploring SVM hyperparameters
    6. Exploring artificial neural network hyperparameters
    7. Summary
  20. Chapter 12: Introducing Hyperparameter Tuning Decision Map
    1. Getting familiar with HTDM
    2. Case study 1 – using HTDM with a CatBoost classifier
    3. Case study 2 – using HTDM with a conditional hyperparameter space
    4. Case study 3 – using HTDM with prior knowledge of the hyperparameter values
    5. Summary
  21. Chapter 13: Tracking Hyperparameter Tuning Experiments
    1. Technical requirements
    2. Revisiting the usual practices
      1. Using a built-in Python dictionary
      2. Using a configuration file
      3. Using additional modules
    3. Exploring Neptune
    4. Exploring scikit-optimize
    5. Exploring Optuna
    6. Exploring Microsoft NNI
    7. Exploring MLflow
    8. Summary
  22. Chapter 14: Conclusions and Next Steps
    1. Revisiting hyperparameter tuning methods and packages
    2. Revisiting HTDM
    3. What’s next?
    4. Summary
    5. Why subscribe?
  23. Other Books You May Enjoy
    1. Packt is searching for authors like you
    2. Share Your Thoughts

Product information

  • Title: Hyperparameter Tuning with Python
  • Author(s): Louis Owen
  • Release date: July 2022
  • Publisher(s): Packt Publishing
  • ISBN: 9781803235875