
Title Azure LLMOps
Edition [First edition]
Published [Place of publication not identified] : Pragmatic AI Solutions, 2024

Description 1 online resource (1 video file (4 hr., 32 min.)) : sound, color
Summary Azure LLMOps Course Overview
Welcome to our comprehensive course on building Large Language Model applications on Azure! In this hands-on introduction you will gain expertise in leveraging Azure's AI services, including Azure OpenAI Service, to develop, deploy, and manage powerful LLM-based solutions using Python, Docker, LangChain, Semantic Kernel, and microservices. Get started with an introduction to Azure fundamentals, including the portal, Azure ML, and Azure OpenAI Service. Learn strategies for responsible data grounding and mitigating risks when using LLMs. Discover options for deploying pre-trained models in Azure and optimizing prompts. Dive deeper into production-ready deployment with Azure ML, implementing compute, GPUs, Docker containers, and inference APIs. Extend capabilities by building custom functions and microservices with LangChain. Architect performant applications with retrieval-augmented generation (RAG) and Azure AI Search (formerly known as Cognitive Search). Automate deployments with GitHub Actions, scaling through Azure Container Apps. Our project-based approach will empower you to develop industry-leading intelligent applications on Azure's trusted cloud platform. Gain expertise in building the next generation of AI-powered solutions on Azure using Python, Docker, LangChain, and microservices!
The course is divided into 4 weeks.
Week 1: Introduction to LLMOps with Azure
This introductory week covers Azure fundamentals, including the portal, AI services, responsible data grounding for LLMs, and deploying initial models.
Learning Objectives
Describe Azure's core services and tools for AI solutions like Azure ML.
Explain how LLMs work, their benefits and risks, and mitigation strategies.
Discover and deploy pre-trained LLMs in Azure using responsible data grounding.
Week 2: LLMs with Azure
This week focuses on production-ready deployment of LLMs using Azure ML and Azure OpenAI Service: compute, GPUs, Docker containers, and inference APIs.
Learning Objectives
Manage compute, GPUs, Docker containers, and quotas with Azure ML.
Deploy LLMs and leverage inference APIs in Azure ML and Azure OpenAI.
Monitor usage, manage keys and endpoints, and clean up resources properly.
Week 3: Extending with Functions and Plugins
This week covers optimizing prompts with Semantic Kernel and extending capabilities via custom functions and microservices with LangChain.
Learning Objectives
Create advanced prompts using Semantic Kernel for nuanced LLM interactions.
Build custom functions and microservices with LangChain to extend system capabilities.
Implement functions with external APIs to customize model behavior.
Week 4: Building an end-to-end LLM application on Azure
This week explores architectural patterns, RAG with Azure AI Search, and automated deployments using GitHub Actions.
Learning Objectives
Architect LLM apps using retrieval-augmented generation (RAG).
Create search indexes and embeddings in Azure AI Search to power RAG.
Automate deployments and testing using GitHub Actions workflows.
Deploy an end-to-end LLM application on Azure leveraging RAG and GitHub Actions.
About your instructor
Alfredo Deza has over a decade of experience as a software engineer doing DevOps, automation, and scalable system architecture. Before getting into technology, he participated in the 2004 Olympic Games and was the first-ever World Champion in High Jump representing Peru. He currently works in Developer Relations at Microsoft and is an Adjunct Professor at Duke University, teaching Machine Learning, Cloud Computing, Data Engineering, Python, and Rust. With Alfredo's guidance, you will gain the knowledge and skills to build, deploy, and manage Large Language Model applications with the Azure cloud.
Resources
Hugging Face for MLOps
Doing MLOps with Databricks and MLFlow
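The course itself is hands-on and Python-based; as a rough illustration of the retrieval-augmented generation pattern mentioned in the summary, the sketch below shows a grounded chat completion against Azure OpenAI using the openai Python package. The deployment name, environment variable names, and the retrieve() helper are hypothetical placeholders and are not taken from the course materials.

import os
from openai import AzureOpenAI

# Hypothetical stub: in a full RAG setup an Azure AI Search index would
# supply the grounding passages; this placeholder stands in for that lookup.
def retrieve(question: str) -> str:
    return "Passages returned by a search index would go here."

# Client configuration assumes credentials are provided via environment
# variables; the names used here are conventional, not course-specific.
client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_version="2024-02-01",
)

question = "How do I deploy a model with Azure ML?"
context = retrieve(question)

# "my-gpt-deployment" is a hypothetical Azure OpenAI deployment name.
response = client.chat.completions.create(
    model="my-gpt-deployment",
    messages=[
        {"role": "system", "content": f"Answer using only this context:\n{context}"},
        {"role": "user", "content": question},
    ],
)
print(response.choices[0].message.content)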
Performer Alfredo Deza, instructor
Notes Online resource; title from title details screen (O'Reilly, viewed January 23, 2024)
Subject Windows Azure. http://id.loc.gov/authorities/names/n2010028313
Subject Cloud computing.
Application software -- Development.
Genre/Form Instructional films.
Nonfiction films.
Internet videos.
Films de formation.
Films autres que de fiction.
Vidéos sur Internet.
Form Streaming video
Author Deza, Alfredo, instructor.
Pragmatic AI Solutions (Firm), publisher
Other Titles Azure large language model applications