In this cookbook, you’ll learn how to integrate Maxim’s tracing into a GenAI app powered by Google Gemini. We’ll walk through a minimal example that shows how to set up the integration, trace requests, and log tool calls.

Prerequisites

Before you begin, you’ll need:

- A Maxim account with an API key and a log repository
- A Gemini API key
- The Python packages (package names as published at the time of writing; check each project’s docs): pip install maxim-py google-genai python-dotenv

1. Load Environment Variables

First, load your API keys from a .env file or your environment.
import dotenv
dotenv.load_dotenv()
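A minimal .env file for this walkthrough might look like the sketch below. GEMINI_API_KEY matches the code in this cookbook; MAXIM_API_KEY and MAXIM_LOG_REPO_ID are the variable names the Maxim SDK conventionally reads — verify them against the Maxim docs for your SDK version.

```
MAXIM_API_KEY=your-maxim-api-key
MAXIM_LOG_REPO_ID=your-log-repo-id
GEMINI_API_KEY=your-gemini-api-key
```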

2. Initialize the Maxim Logger

Set up the Maxim logger here; the next step wraps the Gemini client with it for tracing.
from maxim import Maxim
from maxim.logger.gemini import MaximGeminiClient

logger = Maxim().logger()

3. Create the Gemini Client with Maxim Tracing

import os
from google import genai

client = MaximGeminiClient(
    client=genai.Client(api_key=os.getenv("GEMINI_API_KEY")),
    logger=logger
)
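If GEMINI_API_KEY is unset, os.getenv returns None and the client creation fails later with a less obvious error. A small fail-fast check can make the misconfiguration explicit — require_env here is a hypothetical helper, not part of either SDK:

```python
import os

def require_env(name: str) -> str:
    """Return the named environment variable or fail with a clear message."""
    value = os.getenv(name)
    if not value:
        raise RuntimeError(f"{name} is not set; add it to your .env file")
    return value

# Stand-in value so this sketch runs without a real key:
os.environ.setdefault("GEMINI_API_KEY", "demo-key")
print(require_env("GEMINI_API_KEY"))
```

You would then pass require_env("GEMINI_API_KEY") instead of os.getenv("GEMINI_API_KEY") when constructing the client.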

4. (Optional) Define a Tool Function

You can trace tool calls (function calls) as part of your workflow. For example, a weather function:
def get_current_weather(location: str) -> str:
    """Get the current weather in a given location."""
    print(f"Called with: {location=}")
    return "23C"
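Passing a plain Python function works because google-genai’s automatic function calling derives a tool declaration from the function’s name, docstring, and type hints — which is why the docstring and annotations above matter. A rough sketch of what gets extracted (the declaration dict here is illustrative, not the SDK’s actual schema format):

```python
import inspect
from typing import get_type_hints

def get_current_weather(location: str) -> str:
    """Get the current weather in a given location."""
    return "23C"

# The pieces the SDK inspects to describe the tool to the model:
declaration = {
    "name": get_current_weather.__name__,
    "description": inspect.getdoc(get_current_weather),
    "parameters": {
        name: hint.__name__
        for name, hint in get_type_hints(get_current_weather).items()
        if name != "return"
    },
}
print(declaration)
```

A function with no docstring or type hints gives the model much less to go on when deciding whether and how to call it.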

5. Generate Content with Tracing

Now, make a request to Gemini and trace it with Maxim. You can also pass tool functions for tool-calling scenarios.
response = client.models.generate_content(
    model="gemini-2.0-flash",
    contents="What's the temp in SF?",
    config={
        "tools": [get_current_weather],
        "system_instruction": "You are a helpful assistant",
        "temperature": 0.8,
    },
)

6. View Traces in Maxim

All requests, responses, and tool calls are now automatically traced and can be viewed in your Maxim dashboard.

Full Example

import dotenv
import os
from maxim import Maxim
from maxim.logger.gemini import MaximGeminiClient
from google import genai

dotenv.load_dotenv()

logger = Maxim().logger()
client = MaximGeminiClient(
    client=genai.Client(api_key=os.getenv("GEMINI_API_KEY")),
    logger=logger
)

def get_current_weather(location: str) -> str:
    """Get the current weather in a given location."""
    print(f"Called with: {location=}")
    return "23C"

response = client.models.generate_content(
    model="gemini-2.0-flash",
    contents="What's the temp in SF?",
    config={
        "tools": [get_current_weather],
        "system_instruction": "You are a helpful assistant",
        "temperature": 0.8,
    },
)
print(response.text)
For more details, see the Maxim Python SDK documentation.