Ensemble Methods for Machine Learning, Video Edition

Video description

In Video Editions the narrator reads the book while the content, figures, code listings, diagrams, and text appear on the screen. Like an audiobook that you can also watch as a video.

Ensemble machine learning combines the power of multiple machine learning approaches, working together to deliver models that are highly performant and highly accurate.

Inside Ensemble Methods for Machine Learning you will find:

  • Methods for classification, regression, and recommendations
  • Sophisticated off-the-shelf ensemble implementations
  • Random forests, boosting, and gradient boosting
  • Feature engineering and ensemble diversity
  • Interpretability and explainability for ensemble methods

Ensemble machine learning trains a diverse group of machine learning models to work together, aggregating their output to deliver richer results than a single model. Now in Ensemble Methods for Machine Learning you’ll discover core ensemble methods with a proven track record in both data science competitions and real-world applications. Hands-on case studies show you how each algorithm works in production. By the time you're done, you'll know the benefits, limitations, and practical methods of applying ensemble machine learning to real-world data, and be ready to build more explainable ML systems.

About the Technology
Automatically compare, contrast, and blend the output from multiple models to squeeze the best results from your data. Ensemble machine learning applies a “wisdom of crowds” method that dodges the inaccuracies and limitations of a single model. By basing responses on multiple perspectives, this innovative approach can deliver robust predictions even without massive datasets.
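The "wisdom of crowds" idea above can be sketched in a few lines of plain Python (this is an illustrative toy, not an example from the book): three hypothetical rule-based classifiers each make different mistakes, and a majority vote over their outputs is more robust than any single rule.

```python
from collections import Counter

def majority_vote(predictions):
    """Aggregate one label per model into a single ensemble prediction."""
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical base models: each catches a different spam signal,
# so each errs on different inputs.
def stump_1(text): return "spam" if "win" in text else "ham"
def stump_2(text): return "spam" if "free" in text else "ham"
def stump_3(text): return "spam" if "$$$" in text else "ham"

def ensemble(text):
    return majority_vote([m(text) for m in (stump_1, stump_2, stump_3)])

print(ensemble("win free $$$ now"))  # "spam" -- all three models agree
print(ensemble("free lunch today"))  # "ham" -- only one model fires, vote says ham
```

A single stump is fooled whenever its one keyword is present or absent for the wrong reason; the vote only fails when a majority of the diverse models err on the same input, which is the core intuition behind bagging, random forests, and the other methods the book covers.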

About the Book
Ensemble Methods for Machine Learning teaches you practical techniques for applying multiple ML approaches simultaneously. Each chapter contains a unique case study that demonstrates a fully functional ensemble method, with examples including medical diagnosis, sentiment analysis, handwriting classification, and more. There’s no complex math or theory—you’ll learn in a visuals-first manner, with ample code for easy experimentation!

What's Inside
  • Bagging, boosting, and gradient boosting
  • Methods for classification, regression, and retrieval
  • Interpretability and explainability for ensemble methods
  • Feature engineering and ensemble diversity


About the Reader
For Python programmers with machine learning experience.

About the Author
Gautam Kunapuli has over 15 years of experience in academia and the machine learning industry.

Quotes
An excellent guide to ensemble learning with concepts, code, and examples.
- Peter V. Henstock, Machine Learning and AI Lead, Pfizer Inc.; Advanced AI/ML Lecturer, Harvard Extension School

Extremely valuable for more complex scenarios that single models aren’t able to accurately capture.
- McHughson Chambers, Roy Hobbs Diamond Enterprise

Ensemble methods are a valuable tool. I can aggregate the strengths from multiple methods while mitigating their individual weaknesses and increasing model performance.
- Noah Flynn, Amazon

Step by step and with clear descriptions. Very understandable.
- Oliver Korten, ORONTEC

Table of contents

  1. Part 1. The basics of ensembles
  2. Chapter 1. Ensemble methods: Hype or hallelujah?
  3. Chapter 1. Why you should care about ensemble learning
  4. Chapter 1. Fit vs. complexity in individual models
  5. Chapter 1. Our first ensemble
  6. Chapter 1. Terminology and taxonomy for ensemble methods
  7. Chapter 1. Summary
  8. Part 2. Essential ensemble methods
  9. Chapter 2. Homogeneous parallel ensembles: Bagging and random forests
  10. Chapter 2. Bagging: Bootstrap aggregating
  11. Chapter 2. Random forests
  12. Chapter 2. More homogeneous parallel ensembles
  13. Chapter 2. Case study: Breast cancer diagnosis
  14. Chapter 2. Summary
  15. Chapter 3. Heterogeneous parallel ensembles: Combining strong learners
  16. Chapter 3. Combining predictions by weighting
  17. Chapter 3. Combining predictions by meta-learning
  18. Chapter 3. Case study: Sentiment analysis
  19. Chapter 3. Summary
  20. Chapter 4. Sequential ensembles: Adaptive boosting
  21. Chapter 4. AdaBoost: Adaptive boosting
  22. Chapter 4. AdaBoost in practice
  23. Chapter 4. Case study: Handwritten digit classification
  24. Chapter 4. LogitBoost: Boosting with the logistic loss
  25. Chapter 4. Summary
  26. Chapter 5. Sequential ensembles: Gradient boosting
  27. Chapter 5. Gradient boosting: Gradient descent + boosting
  28. Chapter 5. LightGBM: A framework for gradient boosting
  29. Chapter 5. LightGBM in practice
  30. Chapter 5. Case study: Document retrieval
  31. Chapter 5. Summary
  32. Chapter 6. Sequential ensembles: Newton boosting
  33. Chapter 6. Newton boosting: Newton’s method + boosting
  34. Chapter 6. XGBoost: A framework for Newton boosting
  35. Chapter 6. XGBoost in practice
  36. Chapter 6. Case study redux: Document retrieval
  37. Chapter 6. Summary
  38. Part 3. Ensembles in the wild: Adapting ensemble methods to your data
  39. Chapter 7. Learning with continuous and count labels
  40. Chapter 7. Parallel ensembles for regression
  41. Chapter 7. Sequential ensembles for regression
  42. Chapter 7. Case study: Demand forecasting
  43. Chapter 7. Summary
  44. Chapter 8. Learning with categorical features
  45. Chapter 8. CatBoost: A framework for ordered boosting
  46. Chapter 8. Case study: Income prediction
  47. Chapter 8. Encoding high-cardinality string features
  48. Chapter 8. Summary
  49. Chapter 9. Explaining your ensembles
  50. Chapter 9. Case study: Data-driven marketing
  51. Chapter 9. Black-box methods for global explainability
  52. Chapter 9. Black-box methods for local explainability
  53. Chapter 9. Glass-box ensembles: Training for interpretability
  54. Chapter 9. Summary
  55. Epilogue

Product information

  • Title: Ensemble Methods for Machine Learning, Video Edition
  • Author(s): Gautam Kunapuli
  • Release date: May 2023
  • Publisher(s): Manning Publications
  • ISBN: None