Chapter 2. Differential Privacy Fundamentals

This chapter introduces the fundamentals of differential privacy. Take your time getting familiar with the definitions and ideas in this and the next two chapters, as the framework covered in these chapters forms the structure of nearly all differentially private algorithms.

This chapter builds on the scenario from Chapter 1, where a professor releases mean queries about the test results of a class of students.

By the end of this chapter, you will:

  • Understand the basic terminology of differential privacy

  • Understand distance bounds, such as adjacency, sensitivity, and privacy loss parameters

  • Understand how postprocessing affects a DP release

  • Understand how DP mechanisms compose, both sequentially and in parallel

  • Understand the local and central models of privacy

  • Be able to execute DP queries with the SmartNoise Library
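As a concrete preview of several of these ideas, consider the professor's mean query from Chapter 1. A standard way to privatize it is to clamp each score to known bounds, compute the sensitivity of the mean, and add Laplace noise calibrated to that sensitivity and a privacy-loss parameter ε. The sketch below uses plain NumPy rather than the SmartNoise Library (which is introduced later in the chapter); the function name `dp_mean` and its parameters are illustrative, not part of any library:

```python
import numpy as np

def dp_mean(scores, epsilon, lower=0.0, upper=100.0):
    """Sketch of an epsilon-DP mean release for bounded scores.

    Assumes the dataset size n is public, and that adjacency is
    defined by changing one record. The mean of n values clamped
    to [lower, upper] then has sensitivity (upper - lower) / n.
    """
    n = len(scores)
    # Clamp each score so the sensitivity bound actually holds.
    clamped = np.clip(scores, lower, upper)
    sensitivity = (upper - lower) / n
    # Laplace noise with scale sensitivity / epsilon yields epsilon-DP.
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return clamped.mean() + noise
```

Note the trade-off this makes visible: a smaller ε means a larger noise scale, and hence a noisier (more private) release.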

Note

There is a saying in software development: “If it’s not tested, it’s broken.” A similar mantra holds for differential privacy: “If it’s not proven, it’s not private.” In practice, you should not trust an algorithm to be differentially private unless it is accompanied by a proof. For this reason, algorithms in this book are accompanied by proofs—or, at a minimum, a pointer to where one can be found.
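To give a sense of what such proofs look like, here is the classic argument that the Laplace mechanism satisfies ε-DP (the terms used here—sensitivity, adjacency—are defined formally later in this chapter). For a query $f$ with sensitivity $\Delta$, the mechanism releases $M(x) = f(x) + \mathrm{Lap}(\Delta/\varepsilon)$. For any adjacent datasets $x, x'$ and any output $z$, the ratio of output densities is bounded:

$$
\frac{p_{M(x)}(z)}{p_{M(x')}(z)}
= \frac{\exp\!\left(-\tfrac{\varepsilon}{\Delta}\,|f(x)-z|\right)}
       {\exp\!\left(-\tfrac{\varepsilon}{\Delta}\,|f(x')-z|\right)}
= \exp\!\left(\tfrac{\varepsilon}{\Delta}\,\bigl(|f(x')-z| - |f(x)-z|\bigr)\right)
\le \exp\!\left(\tfrac{\varepsilon}{\Delta}\,|f(x)-f(x')|\right)
\le e^{\varepsilon},
$$

where the first inequality is the triangle inequality and the second holds because $|f(x)-f(x')| \le \Delta$ by the definition of sensitivity. Most DP proofs in this book follow this same pattern: bound how much the output distribution can shift between adjacent inputs.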

You will gain a deeper understanding of differential privacy if you follow along with the proofs. On the other hand, if you don’t consider yourself a mathematician, don’t fret. Focus on understanding the setup and purpose (what ...
