Chapter 3. Under the Hood: Knative

In Chapter 2, we kicked the tires with Knative by using the kn quickstart command, but our focus was on getting a running Knative cluster, not on how the tool works. While the goal of serverless infrastructure is to insulate you from managing the day-to-day details of the underlying systems, it’s helpful to understand the general request flows and patterns so you can assess whether a particular pattern will work well or poorly for your application.

While each infrastructure provider implements the specifics of serverless differently, most of the major platforms offer fairly similar functionality for serverless compute and event distribution. In this chapter, we’ll focus on Knative Serving as an example of the former and Knative Eventing as an example of the latter. Knative is a popular open source serverless platform implemented on Kubernetes; it’s fairly easy to install and can scale horizontally for both throughput and reliability.

Many of the principles in the open source software apply directly to the hosted services offered by cloud providers; I’ll also call out particular distinctions with popular hosted platforms where appropriate. One pattern that sets Knative apart is that the Serving and Eventing projects are independent: you can use just Knative Serving to implement a REST API (for example), or use only Knative Eventing to deliver events to traditional infrastructure components without Serving installed, as the sketches below illustrate.
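
To make that independence concrete, here is a minimal sketch of each half. The first manifest defines a Knative Service that runs an HTTP container as a scale-to-zero REST API; the name hello-api and the container image are placeholders for illustration, not examples from this book.

apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello-api                                  # hypothetical name for illustration
spec:
  template:
    spec:
      containers:
        - image: ghcr.io/example/hello-api:latest  # placeholder image; any HTTP container works
          ports:
            - containerPort: 8080                  # the port your container listens on

With the kn CLI, roughly the same thing can be created with kn service create hello-api --image ghcr.io/example/hello-api:latest.

Conversely, Knative Eventing can deliver events to a plain Kubernetes Service with Serving nowhere in the picture. The sketch below assumes an existing, non-Knative workload behind a Service named legacy-consumer (a hypothetical name) and uses a PingSource to send it a small JSON CloudEvent once a minute.

apiVersion: sources.knative.dev/v1
kind: PingSource
metadata:
  name: minute-tick                  # hypothetical name
spec:
  schedule: "*/1 * * * *"            # standard cron syntax: once a minute
  contentType: "application/json"
  data: '{"greeting": "tick"}'
  sink:
    ref:
      apiVersion: v1
      kind: Service                  # a plain Kubernetes Service, not a Knative Service
      name: legacy-consumer          # placeholder for an existing, non-Knative workload

Neither manifest depends on the other project being installed, which is exactly the point: you can adopt Serving, Eventing, or both.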

Focusing on the open source software also allows ...
