Intro to Calculus (Machine Learning Foundations)
Published by Pearson
Learn Differential Calculus Hands-on with Python and Interactive Exercises
- Introduction to the Core of Calculus: This class is an essential introduction to calculus, focusing on limits and derivatives, the concepts that underpin rates of change, which are fundamental to optimizing machine learning algorithms.
- Critical for Machine Learning Optimization: By emphasizing the computation of derivatives through differentiation, the course directly supports mastery of the optimization techniques central to machine learning, including the backpropagation and stochastic gradient descent used in deep learning.
- Foundational for Advanced Studies: As the gateway to more advanced topics in calculus and machine learning, this class lays the groundwork for the later classes in the Machine Learning Foundations series, particularly Calculus II-IV and the comprehensive class on Optimization.
The Machine Learning Foundations series of online trainings provides a comprehensive overview of all the subjects — mathematics, statistics, and computer science — that underlie contemporary machine learning techniques, including deep learning and other artificial intelligence approaches. Extensive curriculum detail can be found at the course’s GitHub repo.
All of the classes in the ML Foundations series bring theory to life through the combination of vivid full-color illustrations, straightforward Python examples within hands-on Jupyter notebook demos, and comprehension exercises with fully worked solutions.
The focus is on providing you with a practical, functional understanding of the content covered. Context will be given for each topic, highlighting its relevance to machine learning. You will be better positioned to understand cutting-edge machine learning papers and you will be provided with resources for digging even deeper into topics that pique your curiosity.
There are 14 classes in the series, organized into four subject areas:
1. Linear Algebra (three classes)
- Linear Algebra for Machine Learning: Intro
- Linear Algebra for Machine Learning, Level II: Matrix Tensors
- Linear Algebra for Machine Learning, Level III: Eigenvectors
2. Calculus (four classes)
- Calculus for Machine Learning: Intro
- Calculus for Machine Learning, Level II: Automatic Differentiation
- Calculus for Machine Learning, Level III: Partial Derivatives
- Calculus for Machine Learning, Level IV: Gradients & Integrals
3. Probability and Statistics (four classes)
- Intro to Probability
- Probability II and Information Theory
- Intro to Statistics
- Statistics II: Regression and Bayesian
4. Computer Science (three classes)
- Intro to Data Structures and Algorithms
- DSA II: Hashing, Trees, and Graphs
- Optimization
Each of the four subject areas is fairly independent; however, theory within a given subject area generally builds over its three or four classes: topics in later classes often assume an understanding of topics from the earlier ones. Work through the individual classes based on your particular interests or your existing familiarity with the material.
(Note that at any given time, only a subset of the ML Foundations classes will be scheduled and open for registration. To be pushed notifications of upcoming classes in the series, sign up for the instructor’s email newsletter at jonkrohn.com.)
This class, Calculus for Machine Learning: Intro, introduces the mathematical field of calculus — the study of rates of change — from the ground up. It is essential because computing derivatives via differentiation is the basis of optimizing most machine learning algorithms, including the backpropagation and stochastic gradient descent that underpin deep learning. Through the measured exposition of theory paired with interactive examples, you'll develop a working understanding of how calculus is used to compute limits and differentiate functions. The content covered in this class is itself foundational for several other classes in the Machine Learning Foundations series, especially Calculus II-IV and the final class in the series, on Optimization.
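To see why derivatives matter for optimization, consider this minimal sketch of gradient descent driven by a numerical derivative. It is an illustration of the general idea, not an excerpt from the course notebooks; the function, learning rate, and step count are arbitrary choices:

```python
def numerical_derivative(f, x, h=1e-6):
    # Central-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

def gradient_descent_step(f, x, lr=0.1):
    # Nudge x a small step against the slope of f
    return x - lr * numerical_derivative(f, x)

# Minimize f(x) = (x - 3)^2, whose minimum sits at x = 3
f = lambda x: (x - 3) ** 2
x = 0.0
for _ in range(100):
    x = gradient_descent_step(f, x)
print(round(x, 4))  # converges to 3.0
```

Deep learning frameworks replace the numerical derivative with exact, automatically computed gradients, but the update loop is conceptually the same.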
What you’ll learn and how you can apply it
- Appreciate how calculus works, from first principles, via interactive code demos in Python.
- Compute the derivatives of functions by hand using the differentiation rules.
- Be able to more easily comprehend other subjects that underlie ML, including partial-derivative calculus, statistics, and optimization algorithms.
- Be able to more fully grasp the details of machine learning papers and textbooks.
- Develop an understanding of what’s going on beneath the hood of machine learning algorithms, including those used for deep learning.
This live event is for you because...
- You use high-level software (e.g., scikit-learn, the Keras API, PyTorch Lightning) to train or deploy machine learning algorithms, and would now like to understand the fundamentals underlying the abstractions, enabling you to expand your capabilities
- You’re a software developer who would like to develop a firm foundation for the deployment of machine learning algorithms into production systems
- You’re a data scientist who would like to reinforce your understanding of the subjects at the core of your professional discipline
- You’re a data analyst or AI enthusiast who would like to become a data scientist or data/ML engineer, and so you’re keen to deeply understand the field you’re entering from the ground up (very wise of you!)
Prerequisites
- Programming: All code demos will be in Python, so experience with it or another object-oriented programming language would be helpful for following along with the code examples.
- Mathematics: Familiarity with secondary-school-level mathematics will make the class easier to follow. If you are comfortable dealing with quantitative information, such as understanding charts and rearranging simple equations, you should be well prepared for all of the mathematics covered.
Course Set-up
- During class, we’ll work on Jupyter notebooks interactively in the cloud via Google Colab. This requires zero setup, and instructions will be provided in class.
Recommended Preparation
- Watch: Calculus for Machine Learning LiveLessons, Lessons 1-3 (for a head start on the content covered in class) by Jon Krohn
Note: The remainder of Jon’s ML Foundations curriculum is split across the following videos:
- Algebra for Machine Learning LiveLessons
- Probability and Statistics for Machine Learning LiveLessons
- Data Structures, Algorithms, and Machine Learning Optimization LiveLessons
Recommended Follow-up
- Attend: Calculus for Machine Learning, Level II: Automatic Differentiation (ML Foundations Series) by Jon Krohn
- Watch: Calculus for ML LiveLessons by Jon Krohn
- Explore: Math for Machine Learning by Jon Krohn
- Explore: Deep Learning: The Complete Guide by Jon Krohn
Schedule
The time frames are only estimates and may vary according to how the class is progressing.
Segment 1: Orientation to Calculus (60 min)
- What Calculus Is
- A Brief History of Calculus
- The Method of Exhaustion
- Calculating Limits
Q&A: 5 minutes
Break: 10 minutes
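As a preview of the kind of hands-on demo Segment 1 builds toward, here is a short, hypothetical sketch (not taken from the course notebooks) of estimating a limit numerically, using the classic example of sin(x)/x as x approaches 0:

```python
import math

def f(x):
    return math.sin(x) / x  # undefined at x = 0 itself

# Approach x = 0 from both sides with ever-smaller steps
for h in (0.1, 0.01, 0.001, 0.0001):
    print(h, f(h), f(-h))
# Both one-sided values home in on 1, suggesting the limit is 1
```

This numerical approach mirrors the spirit of the method of exhaustion: squeeze in on a value you cannot evaluate directly.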
Segment 2: Understanding Differentiation (60 min)
- The Delta Method
- The Differentiation Equation
- Derivative Notation
Q&A: 5 minutes
Break: 10 minutes
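The delta method covered in Segment 2 can be previewed with a small, illustrative Python sketch (my own, not from the course materials): shrink delta and watch the slope of the secant line approach the derivative:

```python
def delta_method(f, x, delta):
    # Slope of the secant line between x and x + delta
    return (f(x + delta) - f(x)) / delta

f = lambda x: x ** 2
for delta in (1.0, 0.1, 0.01, 0.001):
    print(delta, delta_method(f, 2.0, delta))
# The secant slopes approach 4, the true derivative of x^2 at x = 2
```

Taking the limit of this expression as delta goes to zero is exactly the differentiation equation the segment derives.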
Segment 3: Differentiation Rules (60 min)
- The Power Rule
- The Constant Multiple Rule
- The Sum Rule
Final Exercises
Q&A: 15 minutes
Course wrap-up and next steps (15 minutes)
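For a flavor of the Segment 3 material, the three differentiation rules can be sanity-checked numerically. The snippet below is an illustrative sketch, not drawn from the course notebooks; it compares central-difference slope estimates against the values the rules predict:

```python
def d(f, x, h=1e-6):
    # Central-difference estimate of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

# Power rule: d/dx x^3 = 3x^2, so the slope at x = 2 is 12
assert abs(d(lambda x: x ** 3, 2.0) - 12.0) < 1e-4

# Constant multiple rule: d/dx (5x^3) = 5 * 3x^2 = 60 at x = 2
assert abs(d(lambda x: 5 * x ** 3, 2.0) - 60.0) < 1e-3

# Sum rule: d/dx (x^3 + x^2) = 3x^2 + 2x = 16 at x = 2
assert abs(d(lambda x: x ** 3 + x ** 2, 2.0) - 16.0) < 1e-4

print("rules check out numerically")
```

In class, the same rules are applied by hand; the numerical check is simply one way to build confidence that they are correct.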
Your Instructor
Jon Krohn
Jon Krohn is Co-Founder and Chief Data Scientist at the machine learning company Nebula. He authored the book Deep Learning Illustrated, an instant #1 bestseller that was translated into seven languages. He is also the host of SuperDataScience, the data science industry’s most listened-to podcast. Jon is renowned for his compelling lectures, which he offers at leading universities and conferences, as well as via his award-winning YouTube channel. He holds a PhD from Oxford and has been publishing on machine learning in prominent academic journals since 2010.