LangGraph Integration with Maxim
LangGraph is a library for building stateful, multi-actor applications with language models. This guide shows you how to integrate Maxim’s observability capabilities with your LangGraph applications in TypeScript/JavaScript.

What You’ll Get
With Maxim’s LangGraph integration, you can automatically track:

- 🔍 LLM Calls: All interactions with language models including prompts, responses, and metadata
- 🤖 Agent Executions: Complex agent workflows and their execution flows
- 🛠️ Tool Calls: Function calls and their results
- 📚 Retrievals: Vector store searches and document retrievals
- ❌ Errors: Failed operations with detailed error information
- 📊 Performance: Latency, token usage, and costs
Prerequisites
Before getting started, make sure you have:

- Node.js 16+ installed
- A Maxim account with API access
- LangChain and LangGraph packages installed
- Your preferred LLM provider API keys (OpenAI, Anthropic, etc.)
- External tool API keys as needed, e.g. a Tavily API key for the search examples (optional - get one at tavily.com)
Installation
Install the required packages:
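A typical install looks like the following; the LangChain/LangGraph package names are the standard ones, and the Maxim SDK package name (`@maximai/maxim-js`) is an assumption to verify against Maxim's current docs:

```shell
# LangGraph + core LangChain packages, an LLM provider, community tools
# (for Tavily search), and the Maxim SDK (package name assumed).
npm install @langchain/langgraph @langchain/core @langchain/openai @langchain/community @maximai/maxim-js
```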
Environment Setup

Create a .env file in your project root:
.env
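A minimal sketch of the file; `MAXIM_LOG_REPO_ID` matches the variable referenced in Troubleshooting below, and the remaining names follow each provider's convention. All values are placeholders:

```
# Maxim credentials
MAXIM_API_KEY=your_maxim_api_key
MAXIM_LOG_REPO_ID=your_log_repo_id

# LLM provider
OPENAI_API_KEY=your_openai_api_key

# Optional: search tool used in the examples
TAVILY_API_KEY=your_tavily_api_key
```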
Quick Start
Here’s a minimal example to get you started:

quickstart.ts - Basic LangGraph Agent
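A minimal quickstart might look like the sketch below. The Maxim-specific names (`@maximai/maxim-js`, `MaximLangchainTracer`, `maxim.logger`, `maxim.cleanup`) are assumptions drawn from Maxim's published SDK and should be checked against the current reference; the LangChain/LangGraph calls are standard.

```typescript
// quickstart.ts — minimal traced agent (Maxim SDK names are assumptions).
import { Maxim } from "@maximai/maxim-js";
import { MaximLangchainTracer } from "@maximai/maxim-js/langchain";
import { ChatOpenAI } from "@langchain/openai";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { HumanMessage } from "@langchain/core/messages";

async function main() {
  // Initialize Maxim and create a logger bound to your log repository
  const maxim = new Maxim({ apiKey: process.env.MAXIM_API_KEY! });
  const logger = await maxim.logger({ id: process.env.MAXIM_LOG_REPO_ID! });
  const tracer = new MaximLangchainTracer(logger);

  // Build a minimal ReAct agent with no tools
  const agent = createReactAgent({
    llm: new ChatOpenAI({ model: "gpt-4o-mini" }),
    tools: [],
  });

  // Pass the tracer as a callback so this run is logged to Maxim
  const result = await agent.invoke(
    { messages: [new HumanMessage("What is LangGraph?")] },
    { callbacks: [tracer] }
  );
  console.log(result.messages.at(-1)?.content);

  await maxim.cleanup(); // flush pending logs before exit (assumed method)
}

main().catch(console.error);
```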
Core Integration Patterns
1. Runtime Integration
Add tracing to individual calls:

Runtime Integration Pattern
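A sketch of the runtime pattern: the tracer is supplied per call through LangChain's standard `callbacks` config, so only that invocation is logged. Maxim class and method names are assumptions, as in the Quick Start.

```typescript
// Runtime integration: pass the tracer per call via `callbacks`.
import { Maxim } from "@maximai/maxim-js";
import { MaximLangchainTracer } from "@maximai/maxim-js/langchain";
import { ChatOpenAI } from "@langchain/openai";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { HumanMessage } from "@langchain/core/messages";

async function main() {
  const maxim = new Maxim({ apiKey: process.env.MAXIM_API_KEY! });
  const tracer = new MaximLangchainTracer(
    await maxim.logger({ id: process.env.MAXIM_LOG_REPO_ID! })
  );
  const agent = createReactAgent({
    llm: new ChatOpenAI({ model: "gpt-4o-mini" }),
    tools: [],
  });

  // Only this invocation is traced; other calls to `agent` are not.
  await agent.invoke(
    { messages: [new HumanMessage("Hello")] },
    { callbacks: [tracer] }
  );

  await maxim.cleanup();
}

main().catch(console.error);
```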
2. Permanent Integration
Attach the tracer to agents permanently:

Permanent Integration Pattern
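The permanent pattern uses LangChain's `withConfig` to bind the callbacks once. This fragment assumes an `agent` and `tracer` constructed as in the Quick Start:

```typescript
// Bind once; every later invocation of the returned runnable is traced
// automatically, with no per-call callbacks needed.
const tracedAgent = agent.withConfig({ callbacks: [tracer] });

await tracedAgent.invoke({ messages: [new HumanMessage("First question")] });  // traced
await tracedAgent.invoke({ messages: [new HumanMessage("Follow-up")] });       // also traced
```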
Basic Example
Simple ReAct agent with Tavily search:

basic-agent.ts - ReAct Agent with Search
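A sketch of the basic agent: the Tavily tool's invocations should surface as tool-call spans in Maxim. Maxim SDK names remain assumptions to verify; the LangChain/LangGraph pieces are standard.

```typescript
// basic-agent.ts — ReAct agent with Tavily search, traced end to end.
import { Maxim } from "@maximai/maxim-js";
import { MaximLangchainTracer } from "@maximai/maxim-js/langchain";
import { ChatOpenAI } from "@langchain/openai";
import { TavilySearchResults } from "@langchain/community/tools/tavily_search";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { HumanMessage } from "@langchain/core/messages";

async function main() {
  const maxim = new Maxim({ apiKey: process.env.MAXIM_API_KEY! });
  const logger = await maxim.logger({ id: process.env.MAXIM_LOG_REPO_ID! });
  const tracer = new MaximLangchainTracer(logger);

  // Search-tool calls appear as tool-call spans in the trace
  const agent = createReactAgent({
    llm: new ChatOpenAI({ model: "gpt-4o-mini" }),
    tools: [new TavilySearchResults({ maxResults: 3 })],
  });

  const result = await agent.invoke(
    { messages: [new HumanMessage("What happened in AI news this week?")] },
    { callbacks: [tracer] }
  );
  console.log(result.messages.at(-1)?.content);

  await maxim.cleanup();
}

main().catch(console.error);
```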
Custom Metadata
Customize how your operations appear in Maxim by providing metadata:

Trace-Level Metadata
Trace-Level Metadata Configuration
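A sketch of trace-level configuration. Metadata is passed through LangChain's standard `metadata` config under a `maxim` namespace; the field names here (`traceName`, `traceTags`, `sessionId`) are assumptions based on the tracer's metadata convention and should be confirmed against Maxim's SDK reference.

```typescript
// Trace-level metadata under the `maxim` namespace (field names assumed).
const traceMetadata = {
  maxim: {
    traceName: "order-status-lookup",                    // shows as the trace title
    traceTags: { channel: "chat", priority: "normal" },  // filterable tags
    sessionId: "session-123",                            // hypothetical id; groups traces
  },
};

// Passed through the standard LangChain config alongside the tracer:
// await agent.invoke({ messages }, { callbacks: [tracer], metadata: traceMetadata });
```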
Component-Specific Metadata
Component-Specific Metadata Examples
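Component-level names and tags can be supplied the same way. Every field name below is an assumption about the tracer's convention; verify each against Maxim's SDK reference before relying on it.

```typescript
// Component-specific metadata (all field names assumed).
const componentMetadata = {
  maxim: {
    generationName: "answer-draft",   // names the LLM call
    generationTags: { step: "draft" },
    toolCallName: "tavily-search",    // names the tool-call span
    retrievalName: "kb-lookup",       // names the vector-store retrieval
  },
};

// Supplied like trace-level metadata:
// await chain.invoke(input, { callbacks: [tracer], metadata: componentMetadata });
```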
Error Handling
The tracer automatically captures and logs all errors from LangGraph operations. No additional error handling code is required - simply use the tracer and all failures will be tracked with full context and stack traces.

Supported Providers
The tracer automatically detects and supports major LLM providers:

- OpenAI (including Azure OpenAI)
- Anthropic
- Google (Vertex AI, Gemini)
- Amazon Bedrock
- Hugging Face
- Together AI
- Groq
- Local models
Best Practices
1. Meaningful Names and Tags
Best Practice - Meaningful Metadata
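The idea can be sketched in plain TypeScript; the `maxim` key and field names are assumptions about the tracer's metadata convention, but the naming principle stands on its own.

```typescript
// Prefer descriptive trace names and structured tags over opaque IDs
// (`maxim` key and field names are assumptions — check Maxim's docs).
const goodMetadata = {
  maxim: {
    traceName: "order-refund-flow",  // says what the trace does
    traceTags: { feature: "refunds", team: "payments", version: "v2" },
  },
};

// Avoid: names that tell you nothing when scanning a dashboard.
const badMetadata = { maxim: { traceName: "trace-1", traceTags: {} } };
```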
2. Session Management
Best Practice - Session Management
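A sketch of grouping multi-turn conversations: reuse one session identifier across every call in the conversation. The `sessionId` field name is an assumption to verify in Maxim's docs; the id scheme is hypothetical.

```typescript
import { randomUUID } from "node:crypto";

// One session id per conversation (hypothetical naming scheme)
const sessionId = `support-${randomUUID()}`;

// Reuse the same sessionId for every invoke() in this conversation
function turnMetadata(turn: number) {
  return { maxim: { sessionId, traceName: `turn-${turn}` } };
}

const m1 = turnMetadata(1);
const m2 = turnMetadata(2);
// Both turns share the session, so they appear grouped in Maxim
console.log(m1.maxim.sessionId === m2.maxim.sessionId); // true
```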
3. Environment-Specific Tagging
Best Practice - Environment Tagging
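A sketch of deriving environment tags from the process environment so production traffic can be filtered from dev noise; field names under `maxim` are assumptions.

```typescript
// Tag every trace with the deployment environment
const environment =
  process.env.NODE_ENV === "production" ? "production" : "development";

const envMetadata = {
  maxim: {
    traceTags: {
      environment,                              // "production" | "development"
      region: process.env.AWS_REGION ?? "unknown",
    },
  },
};
```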
4. Cleanup
Critical - Resource Cleanup
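A sketch of shutdown handling, assuming the SDK exposes `maxim.cleanup()` to flush buffered logs (verify the exact method name in Maxim's JS SDK reference):

```typescript
import { Maxim } from "@maximai/maxim-js";

const maxim = new Maxim({ apiKey: process.env.MAXIM_API_KEY! });

async function run() {
  try {
    // ... build agents and invoke them with the tracer ...
  } finally {
    // Always flush before exit, even when an invocation throws;
    // otherwise the last traces may never reach Maxim.
    await maxim.cleanup();
  }
}
```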
Troubleshooting
Common Issues
1. Missing API Keys or API key not found
Solution: Ensure all required environment variables are set in your .env file.
2. Import Error for @langchain/core
Solution: Install the required LangChain packages.
3. Tracer Not Working or No Traces Appearing in Maxim
Solution: Verify your MAXIM_LOG_REPO_ID is correct and the tracer is properly passed to callbacks.
Complete Example: Multi-Agent Customer Service System
Here’s a comprehensive example demonstrating multiple LangGraph features with detailed tracing:

customer-service-agent.ts - Multi-Agent System
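A compact sketch of such a system: a keyword router dispatches each request to a billing or technical specialist node in a `StateGraph`, and the whole graph run is traced. Maxim names and metadata fields remain assumptions to verify; the LangGraph API calls are standard.

```typescript
// customer-service-agent.ts — two-specialist sketch, not a production system.
import { Maxim } from "@maximai/maxim-js";
import { MaximLangchainTracer } from "@maximai/maxim-js/langchain";
import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage, SystemMessage } from "@langchain/core/messages";
import { StateGraph, MessagesAnnotation, START, END } from "@langchain/langgraph";

const model = new ChatOpenAI({ model: "gpt-4o-mini" });

// Each specialist node appends its reply to the shared message state
const billing = async (state: typeof MessagesAnnotation.State) => ({
  messages: [await model.invoke([
    new SystemMessage("You are a billing support specialist."),
    ...state.messages,
  ])],
});

const technical = async (state: typeof MessagesAnnotation.State) => ({
  messages: [await model.invoke([
    new SystemMessage("You are a technical support specialist."),
    ...state.messages,
  ])],
});

// Keyword router deciding which specialist handles the request
const route = (state: typeof MessagesAnnotation.State) => {
  const text = String(state.messages.at(-1)?.content ?? "").toLowerCase();
  return /refund|invoice|charge/.test(text) ? "billing" : "technical";
};

const graph = new StateGraph(MessagesAnnotation)
  .addNode("billing", billing)
  .addNode("technical", technical)
  .addConditionalEdges(START, route, ["billing", "technical"])
  .addEdge("billing", END)
  .addEdge("technical", END)
  .compile();

async function main() {
  const maxim = new Maxim({ apiKey: process.env.MAXIM_API_KEY! });
  const tracer = new MaximLangchainTracer(
    await maxim.logger({ id: process.env.MAXIM_LOG_REPO_ID! })
  );

  const result = await graph.invoke(
    { messages: [new HumanMessage("I was charged twice, please refund one.")] },
    {
      callbacks: [tracer],
      // Metadata field names are assumptions — check Maxim's docs
      metadata: { maxim: { traceName: "customer-service", traceTags: { app: "demo" } } },
    }
  );
  console.log(result.messages.at(-1)?.content);

  await maxim.cleanup();
}

main().catch(console.error);
```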
Next Steps
- Explore LangChain integration for simpler workflows
- Check out the API reference for detailed documentation
- Learn about evaluation workflows to assess your LLM applications
- Set up dashboards to monitor your applications in production