
Hands-on Transfer Learning for NLP with BERT

Published by Pearson

Content level: Intermediate

Introduction to Natural Language Processing with Next-Generation Transformer Architectures

This training focuses on implementing BERT to solve a variety of modern NLP tasks, including information retrieval, sequence classification and regression, and question answering. The training begins with a refresher on the BERT architecture and how BERT learns to model language. We then move into examples of fine-tuning BERT on domain-specific corpora and using pre-trained models to perform NLP tasks out of the box.

BERT is one of the most relevant NLP architectures today, and it is closely related to other important deep learning NLP models such as GPT-3. Both models derive from the relatively new transformer architecture and represent an inflection point in how machines process language and context.

The Natural Language Processing with Next-Generation Transformer Architectures series of online trainings provides a comprehensive overview of state-of-the-art natural language processing (NLP) models, including GPT and BERT, which are derived from the modern attention-driven transformer architecture, as well as the applications these models solve today. All of the trainings in the series blend theory and application through visual mathematical explanations, straightforward Python examples in hands-on Jupyter notebook demos, and comprehensive case studies featuring modern problems solvable with NLP models. This training is part of the series and assumes that attendees arrive with knowledge from the BERT Transformer Architecture for NLP training. (Note that at any given time, only a subset of these classes will be scheduled and open for registration.)

What you’ll learn and how you can apply it

  • The steps to using BERT: pre-training and fine-tuning
  • How BERT can be used to solve a variety of NLP tasks

This live event is for you because...

  • You’re an advanced machine learning engineer with experience in transformers, neural networks, and NLP
  • You’re interested in state-of-the-art NLP architectures
  • You are comfortable using libraries like TensorFlow or PyTorch
  • You have a baseline understanding of BERT

Prerequisites

  • Python 3 proficiency, with some familiarity with interactive Python environments such as notebooks (Jupyter / Google Colab / Kaggle Kernels)
  • Comfort using libraries like TensorFlow or PyTorch
  • An understanding of BERT's architecture and outputs, in order to get the most out of this training

Course Set-up

  • A GitHub repository with the slides, code, and links will be provided upon completion
  • Attendees will need access to the notebooks in the GitHub repository


Schedule

The time frames are only estimates and may vary according to how the class is progressing.

Segment 1: Introduction to BERT + Language Models (30 min)

  • Introduction to Transformers and Attention
  • How BERT creates phrase and token embeddings
  • Introduction to Language Models
  • How BERT is pre-trained
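
To make the pre-training objective concrete, here is a minimal sketch of BERT's masked language modeling using the Hugging Face transformers library. This is an illustrative example, not taken from the course notebooks; the model name bert-base-uncased and the sample sentence are assumptions.

```python
# Masked language modeling: the heart of BERT's pre-training objective.
# Requires: pip install transformers torch
from transformers import pipeline

# Load a pre-trained BERT together with its masked-LM head
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT predicts the token behind [MASK] from both left and right context
for prediction in fill_mask("The doctor wrote a [MASK] for the patient."):
    print(f"{prediction['token_str']:>15}  score={prediction['score']:.3f}")
```

This same bidirectional context is what the fine-tuned models in the later segments build on.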

Segment 2: Use-case 1: Sequence Classification + Regression (30 min)

  • Fine-tuning BERT for multi-label classification (see the sketch after this segment’s outline)
  • Fine-tuning BERT for regression
  • Break
  • Q&A
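
As a taste of the fine-tuning workflow in this segment, the sketch below fine-tunes bert-base-uncased for binary sequence classification with the Hugging Face Trainer API. The SST-2 dataset, the subsampling, and the hyperparameters are illustrative stand-ins, not the course's actual materials.

```python
# Fine-tuning BERT for sequence classification: a minimal sketch.
# Requires: pip install transformers datasets torch
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)  # classification head on the [CLS] token

# SST-2 (movie-review sentiment) stands in for any labeled text dataset
dataset = load_dataset("glue", "sst2")

def tokenize(batch):
    return tokenizer(batch["sentence"], truncation=True,
                     padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-sst2", num_train_epochs=1,
                           per_device_train_batch_size=16),
    train_dataset=dataset["train"].shuffle(seed=42).select(range(2000)),
    eval_dataset=dataset["validation"],
)
trainer.train()
print(trainer.evaluate())
```

For regression, the main change is constructing the model with num_labels=1, which makes the head output a single value trained with a mean-squared-error loss.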

Segment 3: Use-case 2: Grammar and Spelling Check (30 min)

  • Using BERT’s language model capabilities to suggest alternative word choices, make spelling corrections, and perform look-ahead predictions
  • Fine-tuning BERT to recognize grammatically incorrect sentences
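
One common way to implement the word-suggestion idea above is to mask the suspect token and let BERT's masked-LM head rank replacements. A minimal sketch follows; the sentence, the suspect word, and the model name are illustrative assumptions.

```python
# Using BERT's masked-LM head to suggest replacements for a suspect word.
# Requires: pip install transformers torch
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

sentence = "I am going to the bank to depositt my check."
suspect = "depositt"  # a misspelling we want alternatives for

# Hide the suspect word and let BERT rank candidates from context
masked = sentence.replace(suspect, fill_mask.tokenizer.mask_token, 1)
for candidate in fill_mask(masked, top_k=5):
    print(f"{candidate['token_str']:>12}  score={candidate['score']:.3f}")
```

Recognizing grammatically incorrect sentences, by contrast, is typically framed as sequence classification, e.g., fine-tuning on an acceptability dataset such as CoLA.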

Segment 4: Use-case 3: Question/Answering (30 min)

  • Using a BERT model already fine-tuned on SQuAD to perform question answering (see the sketch after this list)
  • Fine-tuning BERT on a custom question/answer dataset
  • Break
  • Q&A
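
Here is a sketch of extractive question answering with a SQuAD-fine-tuned checkpoint; the model ID below is one publicly available option, not necessarily the one used in class, and the context passage is invented for illustration.

```python
# Extractive QA with a BERT model already fine-tuned on SQuAD.
# Requires: pip install transformers torch
from transformers import pipeline

qa = pipeline("question-answering",
              model="bert-large-uncased-whole-word-masking-finetuned-squad")

context = ("BERT was introduced by researchers at Google in 2018. "
           "It is pre-trained on masked language modeling and next-sentence "
           "prediction, then fine-tuned for downstream tasks.")

result = qa(question="Who introduced BERT?", context=context)
print(result["answer"], f"(score: {result['score']:.3f})")
```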

Segment 5: Use-case 4: Knowledge base indexing and information retrieval (30 min)

  • Indexing huge corpora (textbooks and instruction manuals) and using BERT for fast context lookup
  • Tying the system to our Q/A use-case to generate answers to questions given an entire reference text
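
One way to realize this kind of lookup is to embed every passage with BERT once, offline, and answer queries by nearest-neighbor search over those vectors. The sketch below uses mean pooling and cosine similarity; the pooling strategy, model name, and toy passages are assumptions for illustration, not the course's exact indexing pipeline.

```python
# Embedding passages with BERT and retrieving the closest context for a query.
# Requires: pip install transformers torch
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased").eval()

def embed(texts):
    """Mean-pool BERT's last hidden states into one vector per text."""
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state        # (batch, seq, 768)
    mask = enc["attention_mask"].unsqueeze(-1)         # ignore padding tokens
    return (hidden * mask).sum(1) / mask.sum(1)

# A toy "knowledge base" standing in for textbook/manual passages
passages = [
    "To reset the router, hold the power button for ten seconds.",
    "The warranty covers manufacturing defects for two years.",
    "Firmware updates are downloaded automatically over Wi-Fi.",
]
index = embed(passages)                                # precomputed offline

query = embed(["How do I restart my router?"])
scores = torch.nn.functional.cosine_similarity(query, index)
print(passages[int(scores.argmax())])
```

The retrieved passage can then be handed to the question-answering model from the previous segment as its context.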

Segment 6: Use-case 5: Bringing BERT to the browser with a Chrome extension(30 min)

  • We will build a Flask app to house a BERT model, plus a Chrome extension that calls the app’s API to deliver BERT classifications directly to the browser
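
A minimal sketch of the server half of this use-case: a Flask app that exposes a BERT-family classifier over HTTP for the extension to call. The /classify route, the port, and the default sentiment pipeline are illustrative choices, not the course's exact setup.

```python
# A Flask app serving a transformer classifier over HTTP; a Chrome extension
# would POST text to /classify and render the JSON response.
# Requires: pip install flask transformers torch
from flask import Flask, jsonify, request
from transformers import pipeline

app = Flask(__name__)
classifier = pipeline("sentiment-analysis")  # any fine-tuned classifier works

@app.route("/classify", methods=["POST"])
def classify():
    text = request.get_json()["text"]
    # e.g. {"label": "POSITIVE", "score": 0.99}
    return jsonify(classifier(text)[0])

if __name__ == "__main__":
    app.run(port=5000)
```

The extension side would POST the page's selected text to this endpoint and display the returned label.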

Segment 7: Course wrap-up and next steps (20 min)

  • Other BERT use-cases
  • Pulling from HuggingFace’s library of fine-tuned BERT models
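
For example, loading a community fine-tuned BERT from the Hugging Face Hub takes one line per task; the NER model ID below is an illustrative public checkpoint, not a course requirement.

```python
# Loading a community fine-tuned BERT variant from the Hugging Face Hub
# by model ID; no task-specific training code needed.
# Requires: pip install transformers torch
from transformers import pipeline

# Named-entity recognition with a BERT fine-tuned on CoNLL-2003
ner = pipeline("ner", model="dslim/bert-base-NER",
               aggregation_strategy="simple")
print(ner("Sinan teaches NLP courses for Pearson in Baltimore."))
```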

Your Instructor

  • Sinan Ozdemir

    Sinan Ozdemir is founder and CTO of LoopGenius, where he uses state-of-the-art AI to help people create and run their businesses. He has lectured in data science at Johns Hopkins University and authored multiple books, videos, and numerous online courses on data science, machine learning, and generative AI. He also founded the recently acquired Kylie.ai, an enterprise-grade conversational AI platform with RPA capabilities. Sinan most recently published Quick Guide to Large Language Models and launched a podcast audio series, AI Unveiled. Ozdemir holds a master’s degree in pure mathematics from Johns Hopkins University.
