Prerequisites
- Node.js 18+
- OpenAI TypeScript SDK (npm install openai)
- Maxim TypeScript SDK (npm install @maximai/maxim-js)
- API keys for OpenAI and Maxim
Environment Variables
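A minimal setup sketch. OPENAI_API_KEY is the variable the OpenAI SDK reads by default; MAXIM_API_KEY and MAXIM_LOG_REPO_ID are assumed names here, so match them to whatever your code actually reads:

```shell
# API credentials (values are placeholders)
export OPENAI_API_KEY="sk-..."

# Maxim credentials; variable names are assumptions for this guide
export MAXIM_API_KEY="..."
export MAXIM_LOG_REPO_ID="..."   # the Maxim log repository traces are written to
```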
Initialize Maxim Logger
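A minimal initialization sketch, assuming the Maxim class and logger() initializer exported by @maximai/maxim-js, with credentials read from MAXIM_API_KEY and MAXIM_LOG_REPO_ID; check the names against your installed SDK version:

```typescript
import { Maxim } from "@maximai/maxim-js";

// Create the Maxim client with your API key.
const maxim = new Maxim({ apiKey: process.env.MAXIM_API_KEY! });

// Bind a logger to a specific log repository; traces land there.
const logger = await maxim.logger({ id: process.env.MAXIM_LOG_REPO_ID! });
if (!logger) {
  throw new Error("Failed to initialize Maxim logger");
}
```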
Wrap OpenAI Client
Use MaximOpenAIClient to wrap your OpenAI client for automatic tracing:
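A sketch of the wrapping step; the import path @maximai/maxim-js/openai and the MaximOpenAIClient constructor arguments follow my reading of the Maxim SDK and should be verified against the installed version:

```typescript
import OpenAI from "openai";
import { MaximOpenAIClient } from "@maximai/maxim-js/openai";

// Plain OpenAI client, configured as usual.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY! });

// Wrap it with the Maxim logger created earlier; calls made through
// `client` are traced automatically.
const client = new MaximOpenAIClient(openai, logger);
```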
Basic Usage
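As a sketch, a single Responses API call through the wrapped client (the model name is illustrative):

```typescript
// The wrapped client exposes the same surface as the OpenAI client.
const response = await client.responses.create({
  model: "gpt-4o-mini",                  // illustrative model name
  input: "Write a haiku about tracing.", // Responses API takes a single input
});

// output_text is the SDK's convenience accessor for the response text.
console.log(response.output_text);
```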
Make Responses API requests using the wrapped client; they are traced automatically.

Tool Calling Example
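A sketch of a full tool-call round trip. The tool name, schema, and model are illustrative; the flat tool shape, function_call output items, and function_call_output follow-up reflect my understanding of the Responses API and should be checked against the OpenAI SDK docs:

```typescript
// Responses API function tools use a flat schema (name at the top level).
const tools = [
  {
    type: "function" as const,
    name: "get_weather",
    description: "Get the current weather for a city",
    strict: true,
    parameters: {
      type: "object",
      properties: { city: { type: "string" } },
      required: ["city"],
      additionalProperties: false,
    },
  },
];

const first = await client.responses.create({
  model: "gpt-4o-mini", // illustrative
  input: "What is the weather in Paris?",
  tools,
});

// Tool calls arrive as function_call items in the output array.
for (const item of first.output) {
  if (item.type === "function_call") {
    const args = JSON.parse(item.arguments);
    const result = `Sunny in ${args.city}`; // call your real tool here

    // Send the tool result back, referencing the same call_id; the
    // previous_response_id carries the conversation context forward.
    const second = await client.responses.create({
      model: "gpt-4o-mini",
      previous_response_id: first.id,
      input: [
        {
          type: "function_call_output" as const,
          call_id: item.call_id,
          output: result,
        },
      ],
      tools,
    });
    console.log(second.output_text);
  }
}
```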
With the Responses API, tool calls are returned inline in the output array rather than nested inside a message.

Responses API vs Chat Completions
The Responses API is a newer, simpler interface compared to Chat Completions:

| Feature | Responses API | Chat Completions |
|---|---|---|
| Input format | Single input string | Array of messages |
| Output format | Array of output items | Choices with messages |
| Tool calls | Inline in output array | Nested in message |
| Conversation | Built-in context management | Manual message handling |
Custom Headers for Tracing
You can pass custom headers to enrich your traces.

Available Header Options
| Header | Type | Description |
|---|---|---|
| maxim-trace-id | string | Link this generation to an existing trace |
| maxim-session-id | string | Link the parent trace to an existing session |
| maxim-trace-tags | string (JSON) | Custom tags for the trace (e.g., '{"env": "prod"}') |
| maxim-generation-name | string | Custom name for the generation in the dashboard |
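The headers above can be attached per request via the OpenAI SDK's request options; in this sketch the header names come from the table, while the IDs, tag values, and model name are illustrative:

```typescript
const response = await client.responses.create(
  {
    model: "gpt-4o-mini", // illustrative
    input: "Summarize our refund policy.",
  },
  {
    // Per-request options; these headers enrich the Maxim trace.
    headers: {
      "maxim-trace-id": "trace-abc-123",
      "maxim-session-id": "session-xyz-789",
      "maxim-trace-tags": JSON.stringify({ env: "prod" }),
      "maxim-generation-name": "refund-summary",
    },
  },
);
```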
Streaming Support
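A sketch of consuming a streamed response; the event type name follows the OpenAI SDK's Responses streaming events, and the model is illustrative:

```typescript
const stream = await client.responses.create({
  model: "gpt-4o-mini", // illustrative
  input: "Tell me a short story.",
  stream: true,
});

for await (const event of stream) {
  // Incremental text arrives as output_text delta events.
  if (event.type === "response.output_text.delta") {
    process.stdout.write(event.delta);
  }
}
```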
The wrapped client supports streaming responses.

Cleanup
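A cleanup sketch; cleanup() is assumed from the Maxim SDK here, so confirm the method name against your installed version:

```typescript
// Flush any buffered logs and release SDK resources before the
// process exits, so no traces are lost.
await maxim.cleanup();
```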
Always clean up resources when done.

What gets logged to Maxim
- Request Details: Model name, parameters, and all other settings
- Message History: Complete conversation history including user messages and assistant responses
- Response Content: Full assistant responses and metadata
- Usage Statistics: Input tokens, output tokens, total tokens consumed
- Error Handling: Any errors that occur during the request

Resources
- OpenAI Responses API: official OpenAI Responses API documentation
- Maxim JS SDK: Maxim TypeScript SDK on npm