Introduction
LiveKit is a powerful platform for building real-time video, audio, and data applications. With Maxim’s integration, you can monitor and trace your LiveKit voice agents, capturing detailed insights into conversation flows, function calls, and performance metrics in real time. This integration allows you to:
- Monitor voice agent conversations in real time
- Trace function tool calls and their performance
- Debug and optimize your voice AI applications
Requirements
Environment Variables
Set up the following environment variables in your `.env` file:
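A typical `.env` for this integration might look like the sketch below. The Maxim and OpenAI variable names come from the steps that follow; the LiveKit names shown (`LIVEKIT_URL`, `LIVEKIT_API_KEY`, `LIVEKIT_API_SECRET`) are the defaults the LiveKit SDK reads, and `TAVILY_API_KEY` is only needed if you use the web search example.

```bash
MAXIM_API_KEY=your-maxim-api-key
MAXIM_LOG_REPO_ID=your-maxim-log-repo-id
LIVEKIT_URL=wss://your-project.livekit.cloud
LIVEKIT_API_KEY=your-livekit-api-key
LIVEKIT_API_SECRET=your-livekit-api-secret
OPENAI_API_KEY=your-openai-api-key
# Only needed for the web search example below
TAVILY_API_KEY=your-tavily-api-key
```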
Getting Started
Step 1: Obtain API Keys
Maxim API Key
- Sign up at Maxim Console
- Create a new project or select an existing one
- Navigate to the API Keys section
- Generate a new API key and copy your `MAXIM_API_KEY`
- Go to Logs, create a new repository, and copy its `MAXIM_LOG_REPO_ID`
LiveKit Credentials
- Set up your LiveKit server or use LiveKit Cloud, and create a new project.
- Get your server URL, API key, and API secret from the LiveKit dashboard
- Configure the credentials in your environment variables

OpenAI API Key
- Go to the OpenAI Platform and create an API key
- Set the `OPENAI_API_KEY` environment variable
Step 2: Initialize Maxim Logger
`instrument_livekit`: This function integrates Maxim’s observability features with LiveKit Agents, allowing you to automatically capture and send trace data to the platform (a minimal sketch follows at the end of this step).
`logger = Maxim().logger()`: This creates a Maxim logger instance that:
- Connects to your Maxim project using the `MAXIM_API_KEY` and `MAXIM_LOG_REPO_ID` environment variables
- Handles sending trace data to the Maxim platform
- Provides the logging interface for capturing events, metrics, and traces
`on_event`: This is a callback function that gets triggered during trace lifecycle events:
- `event`: A string indicating what happened (`"maxim.trace.started"` or `"maxim.trace.ended"`)
- `data`: A dictionary containing trace information:
  - `trace_id`: Unique identifier for the trace
  - `trace`: The actual trace object with metadata, timing, and other details
- When a new conversation or interaction starts → logs “Trace started”
- When a conversation or interaction ends → logs “Trace ended”
- Useful for debugging and monitoring trace lifecycle in real-time
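A minimal initialization sketch, assuming the Maxim Python SDK exposes `instrument_livekit` under `maxim.logger.livekit` and that it accepts the logger together with the `on_event` callback (check the SDK reference for the exact import path and signature):

```python
import logging

from maxim import Maxim
from maxim.logger.livekit import instrument_livekit  # assumed import path

# Reads MAXIM_API_KEY and MAXIM_LOG_REPO_ID from the environment
logger = Maxim().logger()

def on_event(event: str, data: dict):
    # Called on trace lifecycle events emitted by the instrumentation
    if event == "maxim.trace.started":
        logging.info("Trace started: %s", data.get("trace_id"))
    elif event == "maxim.trace.ended":
        logging.info("Trace ended: %s", data.get("trace_id"))

# Hook Maxim's observability into LiveKit Agents (call signature assumed)
instrument_livekit(logger, on_event)
```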
Step 3: Create Your Voice Agent
- Inherits from `Agent`: Base class for all LiveKit agents
- `instructions`: System prompt that defines the agent’s personality and capabilities; it tells the agent it’s a voice assistant that can use web search
- `@function_tool()`: Decorator that registers this method as a tool the agent can call
- `async def`: Asynchronous function (required for LiveKit agents)
- Type hints: `query: str -> str` helps the AI understand input/output types
- Docstring: Describes the function for the AI model
- Current implementation: Placeholder that returns a formatted string
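Putting the points above together, the agent class might look like this sketch (it assumes the LiveKit Agents 1.x Python API, where `Agent` and `function_tool` are importable from `livekit.agents`):

```python
from livekit.agents import Agent, function_tool

class Assistant(Agent):
    def __init__(self) -> None:
        super().__init__(
            instructions="You are a helpful voice assistant that can search the web."
        )

    @function_tool()
    async def web_search(self, query: str) -> str:
        """Search the web for up-to-date information about a query."""
        # Placeholder implementation; the Tavily-backed version appears later
        return f"Search results for: {query}"
```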
- `ctx: agents.JobContext`: Contains information about the current job/session
- First tries: the `LIVEKIT_ROOM_NAME` environment variable
- Falls back to: a generated name like `assistant-room-a1b2c3d4e5f6...`
- `uuid.uuid4().hex`: Creates a random hexadecimal string
- `session.start()`: Connects the agent to the room
- `agent=Assistant()`: Uses your custom Assistant class
- `ctx.connect()`: Connects to the LiveKit infrastructure
- `generate_reply()`: Makes the agent speak first with a greeting
Complete Example with Web Search
Here’s a complete example that includes web search functionality using Tavily; the full working script is linked under Running Your Agent below.
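The Tavily-backed tool itself might look like this sketch, which assumes the `tavily-python` client (`TavilyClient`) and a `TAVILY_API_KEY` environment variable:

```python
import os

from livekit.agents import Agent, function_tool
from tavily import TavilyClient  # assumed: pip install tavily-python

class Assistant(Agent):
    def __init__(self) -> None:
        super().__init__(
            instructions="You are a helpful voice assistant that can search the web."
        )
        self._tavily = TavilyClient(api_key=os.environ["TAVILY_API_KEY"])

    @function_tool()
    async def web_search(self, query: str) -> str:
        """Search the web and return a short summary of the top results."""
        # Synchronous call for brevity; Tavily also ships an async client
        response = self._tavily.search(query, max_results=3)
        results = response.get("results", [])
        return "\n".join(f"{r['title']}: {r['content']}" for r in results)
```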
What Gets Traced
Agent Conversations
Transcripts containing the system instructions, user messages, and assistant messages are pushed to Maxim.
Running Your Agent
- The working code is available here: https://github.com/maximhq/maxim-cookbooks/blob/main/python/observability-online-eval/livekit/livekit-gemini.py
- Start your LiveKit server (if self-hosting) or ensure your LiveKit Cloud instance is running
- Run your agent:
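For example, assuming the script uses LiveKit’s `cli.run_app` runner as sketched above:

```bash
# dev mode connects to your LiveKit server and reloads on changes
python livekit-gemini.py dev
```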
Troubleshooting
Common Issues
Agent not connecting to LiveKit:
- Verify your LiveKit credentials are correct
- Check that your LiveKit server is accessible
Traces not appearing in Maxim:
- Confirm your `MAXIM_API_KEY` and `MAXIM_LOG_REPO_ID` are set correctly
- Check the Maxim console for any API errors