
Machine Learning Foundations: Calculus II: Partial Derivatives & Integrals

Published by Pearson

Content level: Intermediate to advanced

Using Gradients in Python to Enable Algorithms to Learn from Data

The Machine Learning Foundations series of online trainings provides a comprehensive overview of all of the subjects -- mathematics, statistics, and computer science -- that underlie contemporary machine learning techniques, including deep learning and other artificial intelligence approaches.

All of the classes in the series bring theory to life through a combination of vivid full-color illustrations, straightforward Python examples within hands-on Jupyter notebook demos, and comprehension exercises with fully worked solutions.

The focus is on providing you with a practical, functional understanding of the content covered. Context will be given for each topic, highlighting its relevance to machine learning. You will be better positioned to understand cutting-edge machine learning papers, and you will be provided with resources for digging even deeper into topics that pique your curiosity.

The eight classes in the series are organized into four couplets:

Linear Algebra

Calculus

Statistics

Computer Science

The content in the second class of each couplet follows directly from the content of the first; however, you’re most welcome to pick and choose between any of the individual classes based on your particular interests or your existing familiarity with the material. (Note that at any given time, only a subset of these classes will be scheduled and open for registration. To receive notifications of upcoming classes in the series, sign up for the instructor’s email newsletter at jonkrohn.com.)

This class, Calculus II: Partial Derivatives & Integrals, builds on single-variable derivative calculus to introduce gradients and integral calculus. Gradients of learning, which are facilitated by partial-derivative calculus, are the basis of training most machine learning algorithms with data via stochastic gradient descent (SGD). Paired with the chain rule (also covered in this class), SGD enables the backpropagation algorithm to train deep neural networks. Integral calculus, meanwhile, comes in handy for myriad tasks associated with machine learning, such as finding the area under the so-called “ROC curve” -- a prevailing metric for evaluating classification models. The content covered in this class is itself foundational for several other classes in the Machine Learning Foundations series, especially Probability & Information Theory and Optimization.
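
To make the gradient idea concrete, here is a minimal sketch (an illustration, not course material) of computing partial derivatives of a multivariate function with automatic differentiation; PyTorch is assumed here purely for convenience:

    import torch

    # f(x, y) = x^2 * y + y, a simple multivariate function
    x = torch.tensor(2.0, requires_grad=True)
    y = torch.tensor(3.0, requires_grad=True)

    f = x**2 * y + y
    f.backward()   # autodiff applies the chain rule behind the scenes

    print(x.grad)  # partial df/dx = 2xy = 12.0
    print(y.grad)  # partial df/dy = x^2 + 1 = 5.0

The two .grad values together form the gradient of f -- the same object that SGD follows downhill when the function in question is a model’s cost.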

What you’ll learn and how you can apply it

  • Develop an understanding of what’s going on under the hood of machine learning algorithms, including those used for deep learning.
  • Grasp the details of the partial-derivative, multivariate calculus that is common in machine learning papers, as well as in many of the other subjects that underlie ML, including information theory and optimization algorithms.
  • Use integral calculus to determine the area under any given curve, a recurring task in ML applied, for example, to evaluating model performance by calculating the ROC AUC metric.

This live event is for you because...

  • You use high-level software libraries (e.g., scikit-learn, Keras, TensorFlow) to train or deploy machine learning algorithms, and would now like to understand the fundamentals underlying the abstractions, enabling you to expand your capabilities
  • You’re a software developer who would like to develop a firm foundation for the deployment of machine learning algorithms into production systems
  • You’re a data scientist who would like to reinforce your understanding of the subjects at the core of your professional discipline
  • You’re a data analyst or A.I. enthusiast who would like to become a data scientist or data/ML engineer, and so you’re keen to deeply understand the field you’re entering from the ground up (very wise of you!)

Prerequisites

  • All code demos will be in Python, so experience with it or another object-oriented programming language would be helpful for following along with the hands-on examples.

Materials, downloads, or supplemental content needed in advance:

  • During class, we’ll work on Jupyter notebooks interactively in the cloud via Google Colab. This requires zero setup, and instructions will be provided in class.

Schedule

The time frames are only estimates and may vary according to how the class is progressing.

Segment 1: Review of Introductory Calculus (40 min)

  • Differentiation with Rules
  • Cost (or Loss) Functions
  • Calculating the Derivative of a Cost Function
  • AutoDiff: Automatic Differentiation (see the sketch after this outline)
  • Q&A and Break
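
As a hypothetical preview of this segment (PyTorch stands in here for whatever autodiff tool the class uses), the derivative of a simple quadratic cost function can be checked two ways -- by the differentiation rules and by autodiff:

    import torch

    x_obs, y_obs = 2.0, 10.0                 # one toy data point

    w = torch.tensor(1.0, requires_grad=True)
    cost = (w * x_obs - y_obs) ** 2          # C(w) = (wx - y)^2
    cost.backward()                          # derivative via autodiff

    analytic = 2 * x_obs * (1.0 * x_obs - y_obs)  # rule: dC/dw = 2x(wx - y)
    print(w.grad.item(), analytic)           # both print -32.0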

Segment 2: Gradients Applied to Machine Learning (100 min)

  • Partial Derivatives of Multivariate Functions
  • Gradients
  • Stochastic Gradient Descent (see the sketch after this outline)
  • The Chain Rule
  • Backpropagation
  • Q&A and Break
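
A compact, hypothetical demo tying this segment’s topics together: gradients computed by backpropagation (i.e., the chain rule) drive gradient-descent updates that fit a single weight to the line y = 2x. PyTorch is assumed, and the update shown is full-batch for simplicity, whereas true SGD would sample mini-batches:

    import torch

    xs = torch.tensor([1.0, 2.0, 3.0, 4.0])
    ys = 2.0 * xs                              # ground truth: y = 2x

    w = torch.tensor(0.0, requires_grad=True)
    lr = 0.01                                  # learning rate

    for step in range(100):
        cost = torch.mean((w * xs - ys) ** 2)  # mean squared error
        cost.backward()                        # backprop computes dC/dw
        with torch.no_grad():
            w -= lr * w.grad                   # the gradient-descent update
        w.grad.zero_()                         # clear gradient for next step

    print(w.item())                            # converges toward 2.0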

Segment 3: Integrals (70 min)

  • The Confusion Matrix
  • The Receiver-Operating Characteristic (ROC) Curve
  • Calculating Integrals
  • Finding the Area Under the ROC Curve (see the sketch after this outline)
  • Resources for Further Study of Calculus
  • Final Exercises and Q&A
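
To illustrate the segment’s punchline -- that the ROC AUC is an integral -- here is a toy sketch (a hand-made curve, not course data) approximating the area numerically with the trapezoidal rule:

    import numpy as np

    # False-positive and true-positive rates for a toy classifier,
    # traced as the decision threshold sweeps from 1.0 down to 0.0
    fpr = np.array([0.0, 0.0, 0.5, 0.5, 1.0])
    tpr = np.array([0.0, 0.5, 0.5, 1.0, 1.0])

    auc = np.trapz(tpr, fpr)  # integral of TPR with respect to FPR
    print(auc)                # 0.75 for this toy curve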

Your Instructor

  • Jon Krohn

    Jon Krohn is Co-Founder and Chief Data Scientist at the machine learning company Nebula. He authored the book Deep Learning Illustrated, an instant #1 bestseller that was translated into seven languages. He is also the host of SuperDataScience, the data science industry’s most listened-to podcast. Jon is renowned for his compelling lectures, which he offers at leading universities and conferences, as well as via his award-winning YouTube channel. He holds a PhD from Oxford and has been publishing on machine learning in prominent academic journals since 2010.
