Azure LLMOps

Video description

Course Overview

Welcome to our comprehensive course on building Large Language Model applications on Azure! In this hands-on introduction, you will gain expertise in leveraging Azure's AI services, including Azure OpenAI Service, to develop, deploy, and manage powerful LLM-based solutions using Python, Docker, LangChain, Semantic Kernel, and microservices.

Get started with an introduction to Azure fundamentals, including the portal, Azure ML, and Azure OpenAI Service. Learn strategies for responsible data grounding and mitigating risks when using LLMs. Discover options for deploying pre-trained models in Azure and optimizing prompts.

Dive deeper into production-ready deployment with Azure ML, implementing compute, GPUs, Docker containers, and inference APIs. Extend capabilities by building custom functions and microservices with LangChain. Architect performant applications with retrieval-augmented generation (RAG) and Azure AI Search (formerly known as Cognitive Search).

Automate deployments with GitHub Actions and scale through Azure Container Apps. Our project-based approach will empower you to develop industry-leading intelligent applications on Azure's trusted cloud platform.

Gain expertise in building the next generation of AI-powered solutions on Azure using Python, Docker, LangChain, and microservices!

This course is divided into 4 weeks.

Week 1: Introduction to LLMOps with Azure

This week introduces Azure fundamentals, including the portal, AI services, responsible data grounding for LLMs, and deploying initial models.

Learning Objectives

  • Describe Azure's core services and tools for AI solutions like Azure ML.
  • Explain how LLMs work, their benefits/risks, and mitigation strategies.
  • Discover and deploy pre-trained LLMs in Azure using responsible data grounding.

Week 2: LLMs with Azure

This week focuses on production-ready deployment of LLMs using Azure ML and Azure OpenAI Service, covering compute, GPUs, Docker containers, and inference APIs.

Learning Objectives

  • Manage compute, GPUs, Docker containers, and quotas with Azure ML.
  • Deploy LLMs and leverage inference APIs in Azure ML and Azure OpenAI.
  • Monitor usage, manage keys and endpoints, and clean up resources properly.
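
The keys-and-endpoints workflow this week covers can be sketched in a few lines of Python. The endpoint, deployment name, and API version below are illustrative placeholders, not values from the course:

```python
def chat_completions_url(endpoint: str, deployment: str,
                         api_version: str = "2024-02-01") -> str:
    """Build the Azure OpenAI chat completions URL for a deployed model.

    The endpoint and deployment name come from your Azure resource; the
    API version here is an assumption and may differ in your tenant.
    """
    return (f"{endpoint.rstrip('/')}/openai/deployments/"
            f"{deployment}/chat/completions?api-version={api_version}")


def auth_headers(api_key: str) -> dict:
    # Azure OpenAI authenticates with an "api-key" header rather than
    # the "Authorization: Bearer" header used by openai.com.
    return {"api-key": api_key, "Content-Type": "application/json"}


# A request could then be sent with any HTTP client, for example:
# requests.post(
#     chat_completions_url("https://<resource>.openai.azure.com", "gpt-35-turbo"),
#     headers=auth_headers(api_key),
#     json={"messages": [{"role": "user", "content": "Hello"}]},
# )
```

Keeping the key in an environment variable rather than in code, and deleting unused deployments, ties directly into the usage, quota, and cleanup objectives above.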

Week 3: Extending with Functions and Plugins

This week covers optimizing prompts with Semantic Kernel and extending capabilities via custom functions and microservices with LangChain.

Learning Objectives

  • Create advanced prompts using Semantic Kernel for nuanced LLM interactions.
  • Build custom functions and microservices with LangChain to extend system capabilities.
  • Implement functions with external APIs to customize model behavior.
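
One way to picture the function-calling pattern covered this week: the model returns the name of a function plus JSON-encoded arguments, and your code dispatches to a native Python implementation. The weather function below is a hypothetical example for illustration, not one from the course:

```python
import json


def get_weather(city: str) -> str:
    """A native function the LLM can request; here a canned response."""
    return f"Sunny and 22C in {city}"


# Schema advertised to the model so it knows the function exists
# and what arguments it takes.
WEATHER_SCHEMA = {
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

REGISTRY = {"get_weather": get_weather}


def dispatch(function_call: dict) -> str:
    """Route a model-produced function call to the matching Python function."""
    func = REGISTRY[function_call["name"]]
    kwargs = json.loads(function_call["arguments"])
    return func(**kwargs)


# Simulating what the model might return:
result = dispatch({"name": "get_weather", "arguments": '{"city": "Lima"}'})
```

The same registry-and-dispatch idea underlies plugins in Semantic Kernel and tools in LangChain; swapping the canned response for a call to an external API gives the microservice pattern from the last objective.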

Week 4: Building an end-to-end LLM application on Azure

This week explores architectural patterns, RAG with Azure AI Search (formerly Cognitive Search), and automated deployments using GitHub Actions.

Learning Objectives

  • Architect LLM apps using retrieval-augmented generation (RAG).
  • Create search indexes and embeddings in Azure AI Search to power RAG.
  • Automate deployments and testing using GitHub Actions workflows.
  • Deploy an end-to-end LLM application on Azure leveraging RAG and GitHub Actions.
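
The RAG pattern at the heart of this week reduces to retrieve-then-augment: find the most relevant documents, then prepend them to the prompt. The toy bag-of-words "embeddings" below stand in for a real embedding model and an Azure AI Search index, purely to make the flow concrete:

```python
import math
from collections import Counter


def embed(text: str) -> Counter:
    """Toy embedding: word counts (real systems use an embedding model)."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0


def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank documents by similarity to the query, as a search index would."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]


def augment(query: str, docs: list[str]) -> str:
    """Prepend retrieved context to the question before calling the LLM."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"


docs = ["Azure ML manages compute and GPUs",
        "Azure AI Search powers retrieval for RAG"]
prompt = augment("what powers retrieval?", docs)
```

In the end-to-end application, `embed` becomes a call to an embedding model, `retrieve` becomes a query against the Azure AI Search index, and the augmented prompt is sent to the deployed Azure OpenAI model.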

About your instructor

Alfredo Deza has over a decade of experience as a Software Engineer doing DevOps, automation, and scalable system architecture. Before getting into technology he participated in the 2004 Olympic Games and was the first-ever World Champion in High Jump representing Peru. He currently works in Developer Relations at Microsoft and is an Adjunct Professor at Duke University teaching Machine Learning, Cloud Computing, Data Engineering, Python, and Rust. With Alfredo's guidance, you will gain the knowledge and skills to build, deploy, and create Large Language Model applications with the Azure cloud.

Table of contents

  1. Lesson 1
    1. "Meet Your Course Instructor"
    2. "About This Course"
    3. "Introduction"
    4. "Introduction To The Azure Portal"
    5. "Using Microsoft Learn"
    6. "Identifying Azure AI Solutions"
    7. "Introduction To Azure Machine Learning"
    8. "Introduction To Azure OpenAI Service"
    9. "Summary"
    10. "Introduction"
    11. "What Are LLMs And How Do They Work"
    12. "Benefits And Risks Of LLMs"
    13. "Mitigating Risks Of LLMs"
    14. "Introduction To LLMOps"
    15. "Summary"
    16. "Introduction"
    17. "Discover And Evaluate LLMs In Azure"
    18. "Deployment Options For Inferencing"
    19. "What Is Azure AI Content Safety"
    20. "Azure Machine Learning Differences With Azure OpenAI Service"
    21. "Summary"
  2. Lesson 2
    1. "Introduction"
    2. "GPU Quotas And Availability"
    3. "Creating A Compute Instance"
    4. "Deploying The Model"
    5. "Using The Inference Api"
    6. "Summary"
    7. "Introduction"
    8. "Getting Access To Azure OpenAI Service"
    9. "Creating An Azure OpenAI Resource"
    10. "Deploy An OpenAI Model"
    11. "Using The Playground"
    12. "Summary"
    13. "Introduction"
    14. "Using Keys And Endpoints"
    15. "Creating A Simple Python Example"
    16. "Reviewing Usage And Quotas"
    17. "Cleaning Up Resources"
    18. "Summary"
  3. Lesson 3
    1. "Introduction"
    2. "What Is Semantic Kernel"
    3. "Using Semantic Kernel With Azure"
    4. "Using A System Prompt"
    5. "Advanced System Prompts"
    6. "Summary"
    7. "Introduction"
    8. "Overview Of Functions"
    9. "Defining Functions"
    10. "Using The Function With The LLM"
    11. "Working With Errors"
    12. "Summary"
    13. "Introduction"
    14. "Creating A Glue Function"
    15. "Consuming Function Arguments"
    16. "Using A Native Function"
    17. "Overview Of A Microservice API"
    18. "Using An External Microservice API"
    19. "Summary"
  4. Lesson 4
    1. "Introduction"
    2. "Architectural Overview"
    3. "What Is Retrieval Augmented Generation"
    4. "Overview Of Azure AI Search"
    5. "Automation And Deployment With GitHub Actions"
    6. "Summary"
    7. "Introduction"
    8. "Create The Azure Services"
    9. "Create The Embeddings"
    10. "Create And Upload The Index"
    11. "Verifying The Embeddings"
    12. "Using RAG With Azure OpenAI"
    13. "Summary"
    14. "Introduction"
    15. "Application Overview"
    16. "Setting Up Azure Components"
    17. "Architectural Overview"
    18. "Using GitHub Actions With Azure"
    19. "Verifying And Troubleshooting Deployments"
    20. "Summary"

Product information

  • Title: Azure LLMOps
  • Author(s): Alfredo Deza
  • Release date: January 2024
  • Publisher(s): Pragmatic AI Labs
  • ISBN: 28405272VIDEOPAIML