Chapter 2. Transformers

Many trace the most recent wave of advances in generative AI to the introduction of a class of models called transformers in 2017. Their best-known application is the large language model (LLM): systems such as Llama and GPT-4 are used by hundreds of millions of people daily. Transformers have become a backbone for modern AI applications, powering everything from chatbots and search systems to machine translation and content summarization. They’ve even branched out beyond text, making waves in fields like computer vision, music generation, and protein folding. In this chapter, we’ll explore the core ideas behind transformers and how they work, with a focus on one of their most common applications: language modeling.

Before we dive into the nitty-gritty of transformers, let’s take a step back and understand what language modeling is. At its core, a language model (LM) is a probabilistic model that learns to predict ...
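To make the idea of a probabilistic language model concrete, here is a minimal toy sketch (not from the chapter): a bigram model that estimates the probability of the next word given the current word by counting co-occurrences in a tiny hypothetical corpus. Real LLMs learn far richer conditional distributions, but the underlying object, a probability over the next token, is the same.

```python
from collections import Counter, defaultdict

# Toy corpus for illustration only; any whitespace-tokenized text would do.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count how often each word follows each other word (bigram counts).
counts = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    counts[current][nxt] += 1

def next_word_probs(word):
    """Estimate P(next word | word) by relative frequency."""
    total = sum(counts[word].values())
    return {w: c / total for w, c in counts[word].items()}

# "the" is followed by cat, mat, dog, and rug once each in this corpus,
# so each gets probability 0.25.
print(next_word_probs("the"))
```

A transformer-based LLM plays the same role as `next_word_probs`, except that it conditions on the entire preceding context rather than a single word, and its probabilities come from learned parameters rather than raw counts.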
