Generative AI in Production (GAIP)

Course Overview

In this course, you learn about the challenges that arise when productionizing generative AI-powered applications compared with traditional ML. You learn how to manage experimentation and tuning of your LLMs, and how to deploy, test, and maintain your LLM-powered applications. Finally, you explore best practices for logging and monitoring LLM-powered applications in production.

Who should attend

Developers and machine learning engineers who wish to operationalize generative AI-based applications

Prerequisites

Course Objectives

  • Describe the challenges in productionizing applications using generative AI.
  • Manage experimentation and evaluation for LLM-powered applications.
  • Productionize LLM-powered applications.
  • Implement logging and monitoring for LLM-powered applications.

Outline: Generative AI in Production (GAIP)

Module 1 - Introduction to Generative AI in Production

Topics:

  • AI System Demo: Coffee on Wheels
  • Traditional MLOps vs. GenAIOps
  • Generative AI Operations
  • Components of an LLM System

Objectives:

  • Understand generative AI operations
  • Compare traditional MLOps and GenAIOps
  • Analyze the components of an LLM system

Module 2 - Managing Experimentation

Topics:

  • Datasets and Prompt Engineering
  • RAG and ReAct Architectures
  • LLM Evaluation (metrics and frameworks)
  • Tracking Experiments

Objectives:

  • Experiment with datasets and prompt engineering
  • Utilize RAG and ReAct architectures
  • Evaluate LLMs
  • Track experiments

Activities:

  • Lab: Unit Testing Generative AI Applications
  • Optional Lab: Generative AI with Vertex AI: Prompt Design

Module 3 - Productionizing Generative AI

Topics:

  • Deployment, packaging, and versioning (GenAIOps)
  • Testing LLM systems (unit and integration)
  • Maintenance and updates (operations)
  • Prompt security and migration

Objectives:

  • Deploy, package, and version models
  • Test LLM systems
  • Maintain and update LLMs
  • Manage prompt security and migration

Activities:

  • Lab: Vertex AI Pipelines: Qwik Start
  • Lab: Safeguarding with Vertex AI Gemini API

Module 4 - Logging and Monitoring for Production LLM Systems

Topics:

  • Cloud Logging
  • Prompt versioning, evaluation, and generalization
  • Monitoring for evaluation-serving skew
  • Continuous validation

Objectives:

  • Utilize Cloud Logging
  • Version, evaluate, and generalize prompts
  • Monitor for evaluation-serving skew
  • Utilize continuous validation

Activities:

  • Lab: Vertex AI: Gemini Evaluations Playbook
  • Optional Lab: Supervised Fine Tuning with Gemini for Question and Answering

Prices & Delivery Methods

Online Training

  • Duration: 1 day
  • Price: US $595

Classroom Training

  • Duration: 1 day
  • Price (United States): US $595

Schedule

Currently there are no training dates scheduled for this course.