
Calculus III: Partial Derivatives (Machine Learning Foundations)

Published by Pearson

Content level: Intermediate to advanced

Applying Multivariate Calculus to Compute Gradients on Paper and with Python

The Machine Learning Foundations series of online trainings provides a comprehensive overview of all the subjects — mathematics, statistics, and computer science — that underlie contemporary machine learning techniques, including deep learning and other artificial intelligence approaches. Extensive curriculum detail can be found at the course’s GitHub repo.

All classes in the ML Foundations series bring theory to life through the combination of vivid full-color illustrations, straightforward Python examples within hands-on Jupyter notebook demos, and comprehension exercises with fully worked solutions.

The focus is on providing you with a practical, functional understanding of the content covered. Context will be given for each topic, highlighting its relevance to machine learning. You will be better positioned to understand cutting-edge machine learning papers and you will be provided with resources for digging even deeper into topics that pique your curiosity.

There are 14 classes in the series, organized into four subject areas:

Linear Algebra (three classes)

  • Intro to Linear Algebra
  • Linear Algebra II: Matrix Tensors
  • Linear Algebra III: Eigenvectors

Calculus (four classes)

  • Intro to Calculus
  • Calculus II: Automatic Differentiation
  • Calculus III: Partial Derivatives
  • Calculus IV: Gradients and Integrals

Probability and Statistics (four classes)

  • Intro to Probability
  • Probability II and Information Theory
  • Intro to Statistics
  • Statistics II: Regression and Bayesian

Computer Science (three classes)

  • Intro to Data Structures and Algorithms
  • DSA II: Hashing, Trees, and Graphs
  • Optimization

You’re welcome to pick and choose between any of the 14 individual classes based on your interests or your existing familiarity with the material. Note that while the four subject areas are independent of each other, theory within a given subject area generally builds over its three or four classes — topics in later classes often assume an understanding of topics from the earlier ones.

(Note that at any given time, only a subset of the ML Foundations classes will be scheduled and open for registration. To be notified of upcoming classes in the series, sign up for the instructor’s email newsletter at jonkrohn.com.)

This class, Calculus III: Partial Derivatives, builds on the single-variable derivative calculus of Calculus I-II to introduce partial derivatives. Understanding partial derivatives is essential because they are collected together to form gradients. It is by adjusting model parameters based on these gradients that most machine learning algorithms are able to learn from data (i.e., via gradient descent). The content covered in this class is itself foundational for several other classes in the Machine Learning Foundations series, especially Calculus IV and the final class in the series, on Optimization.
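The gradient-descent idea sketched above can be captured in a few lines of Python. The one-parameter cost function, learning rate, and iteration count below are illustrative assumptions, not taken from the course materials:

```python
# Minimal sketch of learning via gradient descent on a one-parameter model.
# The cost function and hyperparameters are illustrative choices only.

def cost(w):
    # A simple quadratic cost with its minimum at w = 3.
    return (w - 3) ** 2

def grad(w):
    # Hand-derived derivative of the cost: d/dw (w - 3)^2 = 2(w - 3).
    return 2 * (w - 3)

w = 0.0      # arbitrary starting parameter
lr = 0.1     # learning rate
for _ in range(100):
    w -= lr * grad(w)   # step against the gradient

print(round(w, 4))  # w converges toward 3, the cost minimum
```

Each step moves the parameter a small distance opposite the derivative, which is the same mechanism — generalized to many parameters via partial derivatives — that this class builds toward.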

What you’ll learn and how you can apply it

  • Deeply understand the details of the partial-derivative, multivariate calculus that is common in machine learning papers and textbooks, as well as in many other subjects that underlie ML, including information theory and optimization algorithms.
  • Be able to calculate the partial derivatives of cost functions by hand as well as with automatic-differentiation libraries like PyTorch and TensorFlow.
  • Grasp exactly what gradients are and appreciate why they are essential for enabling ML via gradient descent.
  • Develop an understanding of what’s going on beneath the hood of machine learning algorithms, including those used for deep learning.
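As a taste of the second learning goal, here is a hedged Python sketch that checks hand-computed partial derivatives of an illustrative function, z = x² − y², against central finite differences (standing in here for an autodiff library such as PyTorch; the function itself is not from the course):

```python
# Verifying hand-derived partial derivatives numerically.
# The function z = x**2 - y**2 is an illustrative choice.

def z(x, y):
    return x ** 2 - y ** 2

def dz_dx(x, y):
    # Hand-derived: treat y as a constant, so dz/dx = 2x.
    return 2 * x

def dz_dy(x, y):
    # Hand-derived: treat x as a constant, so dz/dy = -2y.
    return -2 * y

def numerical_partial(f, x, y, wrt, h=1e-6):
    # Central finite difference, a stand-in for automatic differentiation.
    if wrt == "x":
        return (f(x + h, y) - f(x - h, y)) / (2 * h)
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

x0, y0 = 2.0, 3.0
print(dz_dx(x0, y0), numerical_partial(z, x0, y0, "x"))  # both ≈ 4.0
print(dz_dy(x0, y0), numerical_partial(z, x0, y0, "y"))  # both ≈ -6.0
```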

This live event is for you because...

  • You use high-level software libraries (e.g., scikit-learn, Keras, TensorFlow) to train or deploy machine learning algorithms, and would now like to understand the fundamentals underlying the abstractions, enabling you to expand your capabilities
  • You’re a software developer who would like to develop a firm foundation for the deployment of machine learning algorithms into production systems
  • You’re a data scientist who would like to reinforce your understanding of the subjects at the core of your professional discipline
  • You’re a data analyst or A.I. enthusiast who would like to become a data scientist or data/ML engineer, and so you’re keen to deeply understand the field you’re entering from the ground up (very wise of you!)

Prerequisites

  • Programming: All code demos will be in Python, so experience with it or another object-oriented programming language would be helpful for following along with the code examples.
  • Mathematics: Familiarity with secondary school-level mathematics will make the class easier to follow. If you are comfortable dealing with quantitative information — such as understanding charts and rearranging simple equations — then you should be well-prepared to follow along with all the mathematics.
  • During class, we’ll work on Jupyter notebooks interactively in the cloud via Google Colab. This requires zero setup and instructions will be provided in class.

Resources

  • If you’re feeling extremely ambitious, you can get a head start on the content we’ll be covering in class by viewing Lessons 1-3 of Jon Krohn’s Calculus for ML LiveLessons.

The remainder of Jon’s ML Foundations curriculum is split across a further series of videos.

Schedule

The time frames are only estimates and may vary according to how the class is progressing.

Segment 1: Review of Single-Variable Calculus (30 min)

  • The Delta Method
  • Differentiation with Rules
  • AutoDiff: Automatic Differentiation
  • Q&A and Break

Segment 2: Partial Derivatives (110 min)

  • Partial Derivatives of Multivariate Functions
  • Partial Derivative Exercises
  • Geometrical Examples of Partial Derivatives
  • Geometrical Exercises
  • Notation
  • The Partial-Derivative Chain Rule
  • Chain Rule Exercises
  • Q&A and Break
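The partial-derivative chain rule covered in Segment 2 can be previewed with a small Python check. The functions z = u³ and u = xy are arbitrary illustrations, not drawn from the class materials:

```python
# Sketch of the multivariate chain rule with illustrative functions:
# z = u**3 where u = x * y, so dz/dx = (dz/du)(du/dx) = 3*u**2 * y.

def u(x, y):
    return x * y

def z(x, y):
    return u(x, y) ** 3

def dz_dx_chain(x, y):
    # Chain rule: dz/du = 3u^2, and du/dx = y (y held constant).
    return 3 * u(x, y) ** 2 * y

def dz_dx_numeric(x, y, h=1e-6):
    # Central-difference check of the hand-derived result.
    return (z(x + h, y) - z(x - h, y)) / (2 * h)

x0, y0 = 1.5, 2.0
print(dz_dx_chain(x0, y0))    # 3 * (1.5*2.0)^2 * 2.0 = 54.0
print(dz_dx_numeric(x0, y0))  # ≈ 54.0
```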

Segment 3: Gradients (70 min)

  • Single-Point Regression
  • Cost (or Loss) Functions
  • Quadratic Cost
  • Partial Derivatives of Quadratic Cost
  • Gradients
  • Descending the Gradient of Quadratic Cost
  • Final Exercises and Q&A
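Segment 3's arc (single-point regression, quadratic cost, its partial derivatives, and descending the gradient) can be previewed in a short Python sketch. Every number below is an illustrative assumption, not course material:

```python
# Single-point regression y_hat = m*x + b with quadratic cost
# C = (y_hat - y)**2, hand-derived partials, and gradient descent.
# All values are illustrative assumptions.

x, y = 2.0, 5.0    # a single training point
m, b = 0.0, 0.0    # initial parameters

def y_hat(m, b):
    return m * x + b

def cost(m, b):
    return (y_hat(m, b) - y) ** 2

# Hand-derived partial derivatives via the chain rule:
#   dC/dm = 2*(y_hat - y)*x,   dC/db = 2*(y_hat - y)
def gradient(m, b):
    err = y_hat(m, b) - y
    return 2 * err * x, 2 * err

lr = 0.01
for _ in range(1000):
    dm, db = gradient(m, b)
    m -= lr * dm   # step each parameter against its partial derivative
    b -= lr * db

print(round(cost(m, b), 6))  # cost driven toward 0
```

The two partial derivatives, stacked together, form the gradient of the cost; stepping each parameter against its own partial derivative is exactly the gradient descent this segment culminates in.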

Your Instructor

  • Jon Krohn

    Jon Krohn is Co-Founder and Chief Data Scientist at the machine learning company Nebula. He authored the book Deep Learning Illustrated, an instant #1 bestseller that was translated into seven languages. He is also the host of SuperDataScience, the data science industry’s most listened-to podcast. Jon is renowned for his compelling lectures, which he offers at leading universities and conferences, as well as via his award-winning YouTube channel. He holds a PhD from Oxford and has been publishing on machine learning in prominent academic journals since 2010.
