
Deep Learning for Beginners in 3 Weeks

Published by O'Reilly Media, Inc.

Content level: Beginner to intermediate

From basics to production with NumPy and TensorFlow

This live event utilizes interactive environments

In this course, you’ll:

  • Explore neural network basics and forward propagation
  • Understand backpropagation and stochastic gradient descent
  • Perform train/test splits and examine production case studies

Course description

Neural networks and deep learning have captured attention for years, and understandably so. Tasks that were previously hard for computers, such as computer vision and audio recognition, have seen enormous strides thanks to these powerful techniques. But how do these mysterious black boxes work? Join expert Thomas Nield to learn how to build a neural network using the popular libraries NumPy and TensorFlow. By understanding how deep learning and neural networks work, along with their strengths and limitations, you’ll recognize when to apply them. You’ll pick up best practices, such as train/test splits and confusion matrices, and you’ll discuss use cases and practical concerns involved in implementing a deep learning project.

Week 1: The Anatomy of a Neural Network

During week 1, you’ll learn the structure of a neural network and deep learning by exploring weights, biases, layers, and activation functions. A simple example of predicting a light or dark font for a given background color will introduce key concepts and show how input data triggers a prediction through forward propagation.
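As a taste of what forward propagation looks like, here is a minimal NumPy sketch in the spirit of the light/dark font example. The layer sizes, random weights, and RGB scaling are illustrative choices, not the course's exact demo:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Illustrative untrained parameters: 3 RGB inputs -> 3 hidden neurons -> 1 output
rng = np.random.default_rng(seed=7)
W_hidden = rng.normal(size=(3, 3))
b_hidden = rng.normal(size=(3, 1))
W_output = rng.normal(size=(1, 3))
b_output = rng.normal(size=(1, 1))

def forward(rgb):
    """Forward propagation: matrix multiply, add bias, apply activation."""
    x = np.array(rgb, dtype=float).reshape(3, 1) / 255.0  # scale inputs to 0-1
    hidden = sigmoid(W_hidden @ x + b_hidden)
    output = sigmoid(W_output @ hidden + b_output)
    return output.item()  # a score between 0 and 1, e.g. "use a light font"

score = forward([30, 50, 60])  # a dark background color
print(round(score, 3))
```

Before training, the output is essentially a guess; weeks 2 and 3 cover how the weights and biases are tuned so the score becomes meaningful.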

Week 2: Backpropagation and Stochastic Gradient Descent

In this session, you’ll explore how to optimize a neural network. You’ll learn about derivatives, understand how to find the right weight and bias parameters, and use gradient descent and stochastic gradient descent to modify weights and biases until the neural network is trained and optimized. You’ll also see a demonstration of NumPy and TensorFlow library implementations of neural networks/deep learning.
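The core optimization idea can be previewed in a few lines: gradient descent uses the derivative to repeatedly step a parameter in the direction that lowers a loss function. This one-variable sketch is illustrative, not the session's exact demo:

```python
# Minimize f(w) = (w - 3)^2 with gradient descent.
# The derivative f'(w) = 2*(w - 3) points uphill, so we step against it.

def df(w):
    return 2 * (w - 3)

w = 0.0             # initial guess
learning_rate = 0.1
for _ in range(200):
    w -= learning_rate * df(w)   # step against the gradient

print(round(w, 4))  # converges toward the minimum at w = 3
```

Training a neural network applies this same idea to many weights and biases at once, with backpropagation supplying the derivatives.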

Week 3: Validation Practices and Production Concerns

In the final week, you’ll explore train/test splits and other practices to verify the fit of a deep learning model and get an overview of practical use cases such as computer vision, self-driving cars, and reinforcement learning.

NOTE: With today’s registration, you’ll be signed up for all three sessions. Although you can attend any of the sessions individually, we recommend participating in all three weeks.

What you’ll learn and how you can apply it

Week 1: The Anatomy of a Neural Network

  • Build a neural network using NumPy and TensorFlow
  • Perform linear algebra operations on matrices containing weights and bias values
  • Explain clearly how neural networks and deep learning work

Week 2: Backpropagation and Stochastic Gradient Descent

  • Explore partial derivatives and apply them to weight and bias parameters
  • Use gradient descent and stochastic gradient descent to train a neural network
  • Understand backpropagation to “train” a neural network using libraries like NumPy, scikit-learn, and TensorFlow

Week 3: Validation Practices and Production Concerns

  • Apply best practices like train/test/validation splits as well as confusion matrices
  • Explain how neural networks operate and communicate the risks involved in a given application
  • Understand the broader context of a neural network project and the larger system it operates in

This live event is for you because...

  • You’re a budding data science professional or software engineer seeking to understand neural networks and deep learning.
  • You’re a data science professional or software engineer who wants to understand the use cases, context, and appropriate practices for implementing a deep learning system.
  • You have some Python proficiency and want to learn about neural networks and deep learning.
  • You’re a project manager who is being asked to apply deep learning to a business.

Prerequisites

Recommended follow-up:

Schedule

The time frames are only estimates and may vary according to how the class is progressing.

Week 1: Anatomy of a Neural Network

Neural networks (60 minutes)

  • Presentation: Neural network structure; neural network types (linear, CNN, RNN, GAN, etc.); optimization of weights and biases
  • Group discussion: Speculating on the strengths and limitations of a neural network
  • Q&A
  • Break

Forward propagation (50 minutes)

  • Presentation: Turning layers, weights, and biases into matrices; structuring training/input data as a matrix; forward propagation; NumPy and TensorFlow implementation
  • Hands-on exercises: Build weight and bias parameters as matrices; implement forward propagation using TensorFlow

Wrap-up and Q&A (10 minutes)

Week 2: Backpropagation and Stochastic Gradient Descent

Derivatives and partial derivatives (60 minutes)

  • Presentation: How optimization works in deep learning; partial derivatives and the chain rule; calculating derivatives for parameters
  • Hands-on exercises: Calculate the partial derivatives; calculate derivatives using chain rule
  • Q&A
  • Break
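The chain-rule exercise above can be previewed for a single neuron. This sketch computes the derivative of a squared-error loss with respect to one weight and sanity-checks it against a finite-difference estimate; the specific loss, activation, and numbers are illustrative:

```python
import math

# One neuron: loss C = (a - y)^2, where a = sigmoid(z) and z = w*x + b.
# Chain rule: dC/dw = dC/da * da/dz * dz/dw

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

x, y_true = 1.5, 1.0
w, b = 0.4, 0.1

z = w * x + b
a = sigmoid(z)
dC_da = 2 * (a - y_true)     # derivative of the squared error
da_dz = a * (1 - a)          # derivative of the sigmoid
dz_dw = x                    # derivative of w*x + b with respect to w
dC_dw = dC_da * da_dz * dz_dw

# Sanity check: nudge w slightly and measure the change in loss directly
h = 1e-6
C = lambda w_: (sigmoid(w_ * x + b) - y_true) ** 2
numeric = (C(w + h) - C(w - h)) / (2 * h)
print(abs(dC_dw - numeric) < 1e-6)  # True
```

Backpropagation is this same chain-rule bookkeeping applied layer by layer, working backward from the loss.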

Backpropagation and gradient descent (50 minutes)

  • Presentation: Backpropagation in deep learning; stochastic gradient descent; implementations using NumPy and TensorFlow
  • Hands-on exercises: Calculate the gradients for a matrix; train a neural network using TensorFlow
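Putting the pieces together, a stochastic gradient descent training loop for a single neuron can be written in plain NumPy. The dataset (classify whether two features sum past 1), learning rate, and epoch count are illustrative assumptions, not the session's exact exercise:

```python
import numpy as np

# Tiny illustrative dataset: label is 1 when x1 + x2 > 1
rng = np.random.default_rng(seed=1)
X = rng.random((200, 2))
y = (X.sum(axis=1) > 1.0).astype(float)

w = np.zeros(2)
b = 0.0
lr = 0.5

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

for epoch in range(50):
    for i in rng.permutation(len(X)):   # stochastic: one shuffled sample at a time
        a = sigmoid(X[i] @ w + b)
        grad = a - y[i]                 # gradient of cross-entropy loss w.r.t. z
        w -= lr * grad * X[i]           # update weights against the gradient
        b -= lr * grad                  # update bias against the gradient

preds = (sigmoid(X @ w + b) > 0.5).astype(float)
accuracy = (preds == y).mean()
print(accuracy)
```

"Stochastic" refers to updating after each randomly drawn sample rather than after a full pass over the data, which makes training cheaper per step and often faster overall.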

Wrap-up and Q&A (10 minutes)

Week 3: Validation Practices and Production Concerns

Test splits and confusion matrices (60 minutes)

  • Presentation: Overfitting versus underfitting; train/test splits; confusion matrices; common mistakes
  • Hands-on exercise: Perform a train/test split
  • Q&A
  • Break
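A confusion matrix of the kind covered here can be tallied directly. The labels and predictions below are made up for illustration:

```python
import numpy as np

# Hypothetical true labels and model predictions for a binary classifier
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0, 1, 0])

tp = int(((y_pred == 1) & (y_true == 1)).sum())  # true positives
tn = int(((y_pred == 0) & (y_true == 0)).sum())  # true negatives
fp = int(((y_pred == 1) & (y_true == 0)).sum())  # false positives
fn = int(((y_pred == 0) & (y_true == 1)).sum())  # false negatives

print([[tn, fp],
       [fn, tp]])               # rows: actual 0/1, columns: predicted 0/1

precision = tp / (tp + fp)      # of predicted positives, how many were right
recall = tp / (tp + fn)         # of actual positives, how many were found
```

Unlike raw accuracy, the confusion matrix separates the two kinds of error, which matters when false positives and false negatives carry different costs.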

Convolutional neural networks and production (50 minutes)

  • Group discussion: How do we train a neural network on images?
  • Presentation: Convolutional neural networks with TensorFlow; reinforcement learning case studies; self-driving car case studies; safety and ethical issues
  • Hands-on exercise: Train a digit recognizer using the MNIST dataset
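The core operation of a convolutional layer is sliding a small filter across an image. This pure-NumPy sketch shows a vertical-edge filter responding to a toy image (the course exercise itself uses TensorFlow; the image and kernel here are illustrative):

```python
import numpy as np

def convolve2d(image, kernel):
    """Valid cross-correlation: slide the kernel over every image patch."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = (image[r:r+kh, c:c+kw] * kernel).sum()
    return out

# Toy 5x5 "image": dark left half, bright right half
image = np.array([[0, 0, 0, 1, 1]] * 5, dtype=float)

# A vertical-edge filter: responds where brightness jumps left to right
kernel = np.array([[-1.0, 1.0]])

response = convolve2d(image, kernel)
print(response[0])  # [0. 0. 1. 0.] -- the filter lights up at the edge
```

A CNN learns the values in many such kernels during training, stacking their responses into feature maps that later layers combine to recognize digits, objects, and more.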

Wrap-up and Q&A (10 minutes)

Your Instructor

  • Thomas Nield

    Thomas Nield is the founder of Nield Consulting Group and an instructor at O’Reilly Media and the University of Southern California, teaching classes on data analysis, machine learning, mathematical optimization, AI system safety, and practical artificial intelligence. He’s authored multiple books including Getting Started with SQL and Essential Math for Data Science, both for O’Reilly. He’s also the founder and inventor of Yawman Flight, a company that develops universal handheld controls for flight simulation and unmanned aerial vehicles. Thomas enjoys making technical content relatable and relevant to those unfamiliar with or intimidated by it.
