LangChain Integration with Maxim
LangChain is a powerful framework for developing applications powered by language models. This guide shows you how to integrate Maxim’s observability capabilities with your LangChain applications in TypeScript/JavaScript.
What You’ll Get
With Maxim’s LangChain integration, you can automatically track:
- 🔍 LLM Calls: All interactions with language models including prompts, responses, and metadata
- ⛓️ Chain Executions: Complex workflows and their execution flows
- 🛠️ Tool Calls: Function calls and their results
- 📚 Retrievals: Vector store searches and document retrievals
- ❌ Errors: Failed operations with detailed error information
- 📊 Performance: Latency, token usage, and costs
Prerequisites
Before getting started, make sure you have:
- Node.js 16+ installed
- A Maxim account with API access
- Your preferred LLM provider API keys (OpenAI, Anthropic, etc.)
Installation
Install the required packages:
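A typical install might look like the following. The Maxim package name (`@maximai/maxim-js`) is assumed from its npm distribution; swap in the LangChain provider packages you actually use:

```shell
# Maxim SDK plus LangChain core and an example provider package
npm install @maximai/maxim-js @langchain/core @langchain/openai dotenv
```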
Environment Setup
Create a .env file in your project root:
.env
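A minimal .env might contain the following. MAXIM_API_KEY and MAXIM_LOG_REPO_ID are the variables referenced later in this guide; provider keys depend on which LLM you use:

```shell
MAXIM_API_KEY=your-maxim-api-key
MAXIM_LOG_REPO_ID=your-log-repo-id
OPENAI_API_KEY=your-openai-api-key
```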
Quick Start
Here’s a minimal example to get you started:
quickstart.ts - Basic LangChain Integration
Core Integration Patterns
1. Runtime Integration
Add tracing to individual calls:
Runtime Integration Pattern
2. Permanent Integration
Attach the tracer to chains permanently:
Permanent Integration Pattern
Basic Example
Simple chat model with tracing:
basic-example.ts - Simple Chat Model
Custom Metadata
Customize how your operations appear in Maxim by providing metadata:
Trace-Level Metadata
Trace-Level Metadata Configuration
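A sketch of trace-level metadata passed through the call config. The `maxim` metadata key and the traceName/traceTags field names are assumptions modeled on this integration’s pattern; confirm the exact shape in the API reference:

```typescript
// Trace-level fields live under a `maxim` key in the call metadata
// (key and field names assumed; see the Maxim API reference)
const traceMetadata = {
  maxim: {
    traceName: "customer-support-query",
    traceTags: { userId: "user-123", plan: "pro" },
  },
};

// Passed together with the tracer on a call, e.g.:
// await model.invoke(question, { callbacks: [tracer], metadata: traceMetadata });
```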
Component-Specific Metadata
Component-Specific Metadata Examples
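Component-level metadata follows the same pattern at the chain, retrieval, or tool level. Every field name below is an illustrative assumption extrapolated from the trace-level shape, not a confirmed API; check the reference before relying on them:

```typescript
// Hypothetical per-component fields under the same `maxim` metadata key
const chainMetadata = {
  maxim: { chainName: "support-pipeline", chainTags: { stage: "triage" } },
};

const retrievalMetadata = {
  maxim: { retrievalName: "docs-search", retrievalTags: { index: "products" } },
};

// Passed in the component's own call config, e.g.:
// await chain.invoke(input, { callbacks: [tracer], metadata: chainMetadata });
// await retriever.invoke(query, { callbacks: [tracer], metadata: retrievalMetadata });
```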
Error Handling
The tracer automatically captures and logs all errors from LangChain operations. No additional error-handling code is required: simply attach the tracer, and all failures are tracked with full context and stack traces.
Supported Providers
The tracer automatically detects and supports major LLM providers:
- OpenAI (including Azure OpenAI)
- Anthropic
- Google (Vertex AI, Gemini)
- Amazon Bedrock
- Hugging Face
- Together AI
- Groq
- Local models
Best Practices
1. Meaningful Names and Tags
Best Practice - Meaningful Metadata
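Descriptive names and tags make traces easy to search and filter later. A sketch, again assuming the `maxim` metadata shape used above:

```typescript
// Prefer names that say what the trace does and tags you will filter by
const goodMetadata = {
  maxim: {
    traceName: "order-status-lookup",
    traceTags: { feature: "order-tracking", userTier: "premium" },
  },
};

// Hard to search for later: generic names and empty tags
const vagueMetadata = {
  maxim: { traceName: "trace1", traceTags: {} },
};
```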
2. Session Management
Best Practice - Session Management
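Reusing one session identifier across every turn of a conversation groups its traces together in Maxim. The sessionId field name is an assumption; confirm it in the API reference:

```typescript
// Build per-turn metadata that reuses a stable conversation-level session id
// (sessionId field name assumed; see the Maxim API reference)
function sessionMetadata(sessionId: string) {
  return { maxim: { sessionId, traceTags: { channel: "web-chat" } } };
}

// Same id for every turn of one conversation, e.g.:
// await model.invoke(turn, { callbacks: [tracer], metadata: sessionMetadata(conversationId) });
```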
3. Environment-Specific Tagging
Best Practice - Environment Tagging
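Stamping every trace with the deployment environment lets you separate production traffic from staging and local runs in dashboards. A sketch using the same assumed tag shape:

```typescript
// Derive the environment tag from NODE_ENV, defaulting to "development"
const environment = process.env.NODE_ENV ?? "development";

const envMetadata = {
  maxim: { traceTags: { environment } },
};

// Merge these tags into every call's metadata, e.g.:
// await model.invoke(input, { callbacks: [tracer], metadata: envMetadata });
```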
4. Cleanup
Critical - Resource Cleanup
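Flushing the SDK before the process exits prevents buffered traces from being lost. A try/finally sketch; the cleanup() method name is an assumption about the Maxim client, so the example uses a minimal structural type rather than the real SDK class:

```typescript
// Structural type for any client exposing an async cleanup(), as the
// Maxim SDK is assumed to do (method name unverified; see API reference)
interface Flushable {
  cleanup(): Promise<void>;
}

// try/finally guarantees buffered traces are flushed even when the job throws
async function runWithCleanup(client: Flushable, job: () => Promise<void>) {
  try {
    await job();
  } finally {
    await client.cleanup();
  }
}

// Usage: await runWithCleanup(maxim, async () => { /* your LangChain calls */ });
```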
Troubleshooting
Common Issues
1. Missing API Keys or API key not found
Solution: Ensure all required environment variables are set.
2. Import Error for @langchain/core
Solution: Install the required LangChain packages.
3. Tracer Not Working or No Traces Appearing in Maxim
Solution: Verify your MAXIM_LOG_REPO_ID is correct and that the tracer is properly passed to callbacks.
Complete Example: Calculator Tool Chain
Here’s a comprehensive example demonstrating a complete tool-calling workflow that executes tools and gets the final response:
calculator-chain.ts - Complete Tool Chain Example
Next Steps
- Explore LangGraph integration for complex agent workflows
- Check out the API reference for detailed documentation
- Learn about evaluation workflows to assess your LLM applications
- Set up dashboards to monitor your applications in production