- Track LLM interactions: Monitor all calls to language models
- Analyze performance: Measure latency, token usage, and costs
- Debug chains: Visualize the flow of information through your LangChain applications
- Evaluate outputs: Assess the quality of responses from your LLM chains
Requirements
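
You need a Maxim account with an API key and a log repository, plus the SDK and LangChain packages. The package names below are assumptions based on the `maxim-py` distribution and an OpenAI-backed chain; swap in the package for your own model provider as needed.

```bash
pip install maxim-py langchain langchain-openai
```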
Environment variables
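
The SDK reads credentials from the environment. A minimal sketch, assuming the variable names `MAXIM_API_KEY` and `MAXIM_LOG_REPO_ID`; both values come from the Maxim dashboard.

```bash
# Assumed variable names; confirm against your SDK version
export MAXIM_API_KEY="your-maxim-api-key"
export MAXIM_LOG_REPO_ID="your-log-repository-id"
```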
Initialize logger
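
A minimal sketch of creating the Maxim client and a logger bound to a log repository. The `Config` and `LoggerConfig` classes and the `maxim.logger(...)` factory reflect one version of the `maxim-py` SDK; if your version takes plain dicts instead, adapt accordingly.

```python
import os

from maxim import Config, Maxim
from maxim.logger import LoggerConfig

# Client authenticates with the API key from the environment
maxim = Maxim(Config(api_key=os.environ["MAXIM_API_KEY"]))

# Logger writes traces into the given log repository
logger = maxim.logger(LoggerConfig(id=os.environ["MAXIM_LOG_REPO_ID"]))
```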
Initialize MaximLangchainTracer
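
The tracer is a LangChain callback handler that forwards chain, tool, and LLM run events to the logger. The import path below is an assumption based on the `maxim-py` package layout; the constructor takes the logger created above.

```python
from maxim.logger.langchain import MaximLangchainTracer

# Callback handler that streams LangChain run events to Maxim
maxim_tracer = MaximLangchainTracer(logger)
```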
Make LLM calls using the MaximLangchainTracer
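
Pass the tracer through LangChain's standard `callbacks` mechanism so every step of the run is captured. The sketch below uses `langchain-openai`'s `ChatOpenAI` purely as an example model; any LangChain runnable accepts callbacks the same way.

```python
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

# Example chain: prompt template piped into a chat model
prompt = ChatPromptTemplate.from_messages([("human", "{question}")])
llm = ChatOpenAI(model="gpt-4o-mini")  # example model; use any provider
chain = prompt | llm

# Attach the tracer via `config` so the whole run is traced in Maxim
response = chain.invoke(
    {"question": "What does MaximLangchainTracer capture?"},
    config={"callbacks": [maxim_tracer]},
)
print(response.content)
```

The same `callbacks` list can be passed to a bare `llm.invoke(...)` call if you are not composing a chain.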
