Calculus for Machine Learning LiveLessons

Video description

6+ Hours of Video Instruction
An introduction to the calculus behind machine learning models

Overview

Calculus for Machine Learning LiveLessons introduces the mathematical field of calculus (the study of rates of change) from the ground up. Calculus is essential because computing derivatives via differentiation is the basis of optimizing most machine learning algorithms, including the backpropagation and stochastic gradient descent techniques used in deep learning. Through the measured exposition of theory paired with interactive examples, you’ll develop a working understanding of how calculus is used to compute limits and differentiate functions. You’ll also learn how to apply automatic differentiation within the popular TensorFlow 2 and PyTorch machine learning libraries. Later lessons build on single-variable derivative calculus to detail the gradients of learning (which are facilitated by partial-derivative calculus) and integral calculus (which determines the area under a curve and comes in handy for myriad tasks associated with machine learning).

Skill Level
Intermediate

Learn How To
--Develop an understanding of what’s going on under the hood of machine learning algorithms, including those used for deep learning.
--Compute the derivatives of functions, including by using AutoDiff in the popular TensorFlow 2 and PyTorch libraries.
--Grasp the details of the partial-derivative, multivariate calculus that is common in machine learning papers and in many other subjects that underlie ML, including information theory and optimization algorithms.
--Use integral calculus to determine the area under any given curve, a recurring task in ML applied, for example, to evaluating model performance by calculating the ROC AUC metric.

Who Should Take This Course
--People who use high-level software libraries (e.g., scikit-learn, Keras, TensorFlow) to train or deploy machine learning algorithms and would like to understand the fundamentals underlying the abstractions, enabling them to expand their capabilities
--Software developers who would like to develop a firm foundation for the deployment of machine learning algorithms into production systems
--Data scientists who would like to reinforce their understanding of the subjects at the core of their professional discipline
--Data analysts or AI enthusiasts who would like to become data scientists or data/ML engineers, and so are keen to deeply understand the field they’re entering from the ground up (a very wise choice!) 


Course Requirements
--Mathematics: Familiarity with secondary school–level mathematics will make the class easier to follow. If you are comfortable dealing with quantitative information, such as understanding charts and rearranging simple equations, you should be well prepared to follow all of the mathematics.
--Programming: All code demos are in Python, so experience with it or another object-oriented programming language would be helpful for following along with the hands-on examples.

Lesson Descriptions:
Lesson 1, “Orientation to Calculus”: In Lesson 1, Jon defines calculus by distinguishing between differential and integral calculus. This is followed by a brief history of calculus that runs all the way through to its modern applications, with a particular emphasis on machine learning.

Lesson 2, “Limits”: Lesson 2 begins with a discussion of continuous versus discontinuous functions. Then Jon covers evaluating limits via both the factoring and the approaching methods. Next, he discusses what happens to limits as they approach infinity. The lesson concludes with comprehension exercises.
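
As a taste of the approaching method in code, here is a minimal sketch (the use of NumPy and SymPy is an illustrative assumption, not necessarily the tooling used in the lesson) that evaluates the classic limit of sin(x)/x as x approaches 0, first numerically and then symbolically:

    # Illustrative sketch: evaluating a limit by approaching, then symbolically
    import numpy as np
    import sympy as sp

    # "Approaching" numerically: plug in values ever closer to 0
    for x in [0.1, 0.01, 0.001]:
        print(x, np.sin(x) / x)   # the ratio approaches 1.0

    # The same limit evaluated symbolically with SymPy
    x_sym = sp.Symbol('x')
    print(sp.limit(sp.sin(x_sym) / x_sym, x_sym, 0))  # 1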

Lesson 3, “Differentiation”: In Lesson 3, Jon focuses on differential calculus. He covers the delta method for finding the slope of a curve and uses it to derive the most common representation of the derivative. After a quick look at derivative notation, Jon introduces the most common differentiation rules: the constant rule, the power rule, the constant product rule, and the sum rule. Exercises wrap up the lesson.
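
To make the delta method concrete, here is a short, self-contained sketch (the function and step size are made up for illustration) that approximates the slope of y = x**2 with a shrinking difference quotient and compares it against the power rule result:

    # Illustrative delta method: approximate dy/dx for y = x**2 at x = 3
    def f(x):
        return x ** 2

    def secant_slope(f, x, delta=1e-6):
        # Slope of the secant line between x and x + delta;
        # as delta shrinks toward zero, this approaches the derivative.
        return (f(x + delta) - f(x)) / delta

    print(secant_slope(f, 3.0))  # ~6.0, matching the power rule: dy/dx = 2x = 6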

Lesson 4, “Advanced Differentiation Rules”: Lesson 4 continues with differentiation, covering its more advanced rules: the product rule, the quotient rule, and the chain rule. After some exercises, Jon unleashes the might of the power rule in situations where you have a series of functions chained together.
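
As an optional way to check chain-rule results from the exercises, a symbolic library can differentiate a chained function for you. The snippet below (SymPy is assumed purely for illustration) confirms that the derivative of (3x + 1)**2 is 2(3x + 1) * 3 = 18x + 6:

    # Illustrative chain-rule check with SymPy: y = (3x + 1)**2
    import sympy as sp

    x = sp.Symbol('x')
    y = (3 * x + 1) ** 2

    print(sp.expand(sp.diff(y, x)))  # 18*x + 6, i.e., 2*(3x + 1)*3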

Lesson 5, “Automatic Differentiation”: Lesson 5 enables you to move beyond differentiation by hand and scale it up with automatic differentiation in the PyTorch and TensorFlow libraries. After representing a line equation as a directed acyclic graph, you will apply automatic differentiation to fit that line to data points with machine learning.
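
As a preview of what automatic differentiation looks like in practice, here are minimal sketches in both libraries (the scalar values are arbitrary, chosen only to keep the example tiny):

    # PyTorch: reverse-mode autodiff on y = x**2
    import torch

    x = torch.tensor(2.0, requires_grad=True)
    y = x ** 2        # forward pass builds the computation graph
    y.backward()      # autodiff traverses the graph in reverse
    print(x.grad)     # tensor(4.), i.e., dy/dx = 2x evaluated at x = 2

    # TensorFlow 2: the same derivative via GradientTape
    import tensorflow as tf

    x_tf = tf.Variable(2.0)
    with tf.GradientTape() as tape:
        y_tf = x_tf ** 2
    print(tape.gradient(y_tf, x_tf))  # 4.0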

Lesson 6, “Partial Derivatives”: Lesson 6 delves into partial derivatives. Jon begins with simple derivatives of multivariate functions, followed by more advanced geometrical examples, partial derivative notation, and the partial derivative chain rule.
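
For a concrete sense of what a partial derivative computation looks like, the sketch below (SymPy and the particular function z = x**2*y + y**3 are illustrative assumptions) differentiates a multivariate function with respect to each variable in turn, holding the other constant:

    # Illustrative partial derivatives of z = x**2 * y + y**3
    import sympy as sp

    x, y = sp.symbols('x y')
    z = x ** 2 * y + y ** 3

    print(sp.diff(z, x))  # dz/dx = 2*x*y         (y treated as a constant)
    print(sp.diff(z, y))  # dz/dy = x**2 + 3*y**2 (x treated as a constant)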

Lesson 7, “Gradients”: Lesson 7 covers the gradient, which gathers the partial derivatives of cost with respect to all of the parameters of the machine learning model from the previous lessons. To build up to this, Jon performs a regression on a single data point and derives the partial derivatives of its quadratic cost. From there, he discusses what it means to descend the gradient of cost and walks through the derivation of the partial derivatives of mean squared error, which enables you to learn from batches of data instead of individual points.
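
To illustrate the idea of descending the gradient of cost, here is a bare-bones sketch of single-point regression (the data point, starting parameters, learning rate, and iteration count are all made up for demonstration):

    # Single-point regression: fit yhat = m*x + b to one point by gradient descent
    x, y = 2.0, 5.0      # a single (input, target) pair
    m, b = 0.0, 0.0      # initial parameter values
    lr = 0.01            # learning rate

    for _ in range(1000):
        yhat = m * x + b
        # Partial derivatives of quadratic cost C = (yhat - y)**2
        dC_dm = 2 * (yhat - y) * x
        dC_db = 2 * (yhat - y)
        # Step each parameter against its partial derivative: descend the gradient
        m -= lr * dC_dm
        b -= lr * dC_db

    print(m, b, m * x + b)  # the fitted line now passes almost exactly through (2, 5)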

Lesson 8, “Integrals”: Lesson 8 switches to integral calculus. To set up a machine learning problem that requires integration to solve, Jon starts off with binary classification, the confusion matrix, and the ROC curve. With that problem in mind, he then covers the rules of indefinite and definite integral calculus needed to solve it. Next, Jon shows you how to perform integration computationally, using Python to find the area under the ROC curve. Finally, he ends the lesson with some resources for further study.
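
As a small preview of the computational side of this lesson, the sketch below (the toy labels and scores are fabricated, and scikit-learn is just one convenient option) computes the area under an ROC curve, which amounts to numeric integration with the trapezoidal rule:

    # Illustrative area under the ROC curve with scikit-learn
    from sklearn.metrics import roc_curve, auc

    y_true  = [0, 0, 1, 1]           # ground-truth binary labels
    y_score = [0.1, 0.4, 0.35, 0.8]  # model-predicted scores

    fpr, tpr, _ = roc_curve(y_true, y_score)
    print(auc(fpr, tpr))  # 0.75, the trapezoidal-rule area under the curve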


About Pearson Video Training
Pearson publishes expert-led video tutorials covering a wide selection of technology topics designed to teach you the skills you need to succeed. These professional and personal technology videos feature world-leading author instructors published by your trusted technology brands: Addison-Wesley, Cisco Press, Pearson IT Certification, Prentice Hall, Sams, and Que. Topics include IT Certification, Network Security, Cisco Technology, Programming, Web Development, Mobile Development, and more. Learn more about Pearson Video training at http://www.informit.com/video.

Table of contents

  1. Introduction
    1. Calculus for Machine Learning LiveLessons (Video Training): Introduction
  2. Lesson 1: Orientation to Calculus
    1. Topics
    2. 1.1 Differential versus Integral Calculus
    3. 1.2 A Brief History
    4. 1.3 Calculus of the Infinitesimals
    5. 1.4 Modern Applications
  3. Lesson 2: Limits
    1. Topics
    2. 2.1 Continuous versus Discontinuous Functions
    3. 2.2 Solving via Factoring
    4. 2.3 Solving via Approaching
    5. 2.4 Approaching Infinity
    6. 2.5 Exercises
  4. Lesson 3: Differentiation
    1. Topics
    2. 3.1 Delta Method
    3. 3.2 The Most Common Representation
    4. 3.3 Derivative Notation
    5. 3.4 Constants
    6. 3.5 Power Rule
    7. 3.6 Constant Product Rule
    8. 3.7 Sum Rule
    9. 3.8 Exercises
  5. Lesson 4: Advanced Differentiation Rules
    1. Topics
    2. 4.1 Product Rule
    3. 4.2 Quotient Rule
    4. 4.3 Chain Rule
    5. 4.4 Exercises
    6. 4.5 Power Rule on a Function Chain
  6. Lesson 5: Automatic Differentiation
    1. Topics
    2. 5.1 Introduction
    3. 5.2 Autodiff with PyTorch
    4. 5.3 Autodiff with TensorFlow
    5. 5.4 Directed Acyclic Graph of a Line Equation
    6. 5.5 Fitting a Line with Machine Learning
  7. Lesson 6: Partial Derivatives
    1. Topics
    2. 6.1 Derivatives of Multivariate Functions
    3. 6.2 Partial Derivative Exercises
    4. 6.3 Geometrical Examples
    5. 6.4 Geometrical Exercises
    6. 6.5 Notation
    7. 6.6 Chain Rule
    8. 6.7 Chain Rule Exercises
  8. Lesson 7: Gradients
    1. Topics
    2. 7.1 Single-Point Regression
    3. 7.2 Partial Derivatives of Quadratic Cost
    4. 7.3 Descending the Gradient of Cost
    5. 7.4 Gradient of Mean Squared Error
    6. 7.5 Backpropagation
    7. 7.6 Higher-Order Partial Derivatives
    8. 7.7 Exercise
  9. Lesson 8: Integrals
    1. Topics
    2. 8.1 Binary Classification
    3. 8.2 The Confusion Matrix and ROC Curve
    4. 8.3 Indefinite Integrals
    5. 8.4 Definite Integrals
    6. 8.5 Numeric Integration with Python
    7. 8.6 Exercises
    8. 8.7 Finding the Area Under the ROC Curve
    9. 8.8 Resources for Further Study of Calculus

Product information

  • Title: Calculus for Machine Learning LiveLessons
  • Author(s): Jon Krohn
  • Release date: January 2021
  • Publisher(s): Pearson
  • ISBN: 0137398177