Video description
MLOps packaging: HuggingFace and Docker Hub
Use automation to package models
Learn how to package a HuggingFace GPT-2 model with MLOps automation and push the result to Docker Hub. With just a little bit of Python and FastAPI, you can have a powerful, self-documented text generation API!
Learning Objectives
In this video lesson, I'll go over the details with an example repository you can use for reference, covering the following learning objectives:
- Create a FastAPI application with HuggingFace
- Interact with the model over HTTP from a container using FastAPI
- Containerize the application using GitHub Actions
- Create repository secrets to login and push to Docker Hub
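The last two objectives come together in a GitHub Actions workflow. The sketch below is a hedged example, not the lesson's exact workflow: the secret names (`DOCKER_HUB_USERNAME`, `DOCKER_HUB_ACCESS_TOKEN`) and image tag are placeholders you would define yourself under the repository's Settings → Secrets.

```yaml
name: Build and push to Docker Hub
on:
  push:
    branches: [main]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Login to Docker Hub
        uses: docker/login-action@v2
        with:
          username: ${{ secrets.DOCKER_HUB_USERNAME }}
          password: ${{ secrets.DOCKER_HUB_ACCESS_TOKEN }}
      - name: Build and push
        uses: docker/build-push-action@v3
        with:
          context: .
          push: true
          tags: ${{ secrets.DOCKER_HUB_USERNAME }}/gpt2-api:latest
```

Storing the credentials as repository secrets keeps them out of the workflow file and the build logs.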
Product information
- Title: MLOps packaging: HuggingFace and Docker
- Author(s):
- Release date: June 2022
- Publisher(s): Pragmatic AI Labs
- ISBN: 50143VIDEOPAIML