POST /v1/prompts/run
Example request:

curl --request POST \
  --url https://api.getmaxim.ai/v1/prompts/run \
  --header 'Content-Type: application/json' \
  --header 'x-maxim-api-key: <api-key>' \
  --data '{
  "promptId": "<string>",
  "versionId": "<string>",
  "workspaceId": "<string>",
  "messages": [
    {
      "role": "assistant",
      "content": "<string>"
    }
  ],
  "modelName": "<string>",
  "modelProvider": "openai",
  "modelParameters": {
    "tools": [
      {
        "type": "function",
        "function": {
          "name": "<string>",
          "description": "<string>",
          "parameters": {
            "type": "<string>",
            "properties": {},
            "description": "<string>",
            "items": "<any>",
            "required": [
              "<string>"
            ]
          }
        }
      }
    ]
  }
}'

Example 200 response:

{
  "data": {
    "output": "<any>",
    "usage": {
      "totalTokens": 123,
      "promptTokens": 123,
      "completionTokens": 123
    }
  }
}
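
The same call can be made from application code; below is a minimal Python sketch using the requests library. The URL, headers, and field names come from this reference, while the placeholder IDs, the MAXIM_API_KEY environment variable, the user role, and the model name are illustrative assumptions.

# Minimal sketch of POST /v1/prompts/run using the requests library.
# Placeholder IDs, the MAXIM_API_KEY environment variable, the user role,
# and the model name are assumptions for illustration only.
import os

import requests

payload = {
    "promptId": "your-prompt-id",        # placeholder
    "versionId": "your-version-id",      # placeholder
    "workspaceId": "your-workspace-id",  # placeholder
    "messages": [
        {"role": "user", "content": "Summarize our onboarding flow."}
    ],
    "modelName": "gpt-4o",               # assumed model for the chosen provider
    "modelProvider": "openai",
}

response = requests.post(
    "https://api.getmaxim.ai/v1/prompts/run",
    headers={
        "Content-Type": "application/json",
        "x-maxim-api-key": os.environ["MAXIM_API_KEY"],
    },
    json=payload,
    timeout=60,
)
response.raise_for_status()

data = response.json()["data"]
print(data["output"])                    # model output
print(data["usage"]["totalTokens"])      # token usage reported by the API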

Authorizations

x-maxim-api-key (string, header, required)
API key for authentication

Body

Content type: application/json

promptId (string, required)
Unique identifier for the prompt

versionId (string, required)
Unique identifier for the version

workspaceId (string, required)
Unique identifier for the workspace

messages (object[], required)
Array of messages. Each entry is a Message object with a role and content, as in the request example above (see also the sketch after this field list).

modelName (string, required)
Name of the model to use

modelProvider (enum<string>, required)
Provider of the model. Available options: openai, azure, huggingface, anthropic, together, google, groq, bedrock, maxim, cohere, ollama, lmstudio, xai

modelParameters (object)
Model parameters configuration, including the tools array of function definitions shown in the request example (see the sketch after this field list).

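For orientation, here is a sketch of how the messages and modelParameters fields can be assembled in Python. The field names and nesting follow this reference and the request example above; the conversation text and the get_weather tool are hypothetical.

# Sketch of the messages and modelParameters fields of a prompts/run body.
# The field names and nesting follow this reference and the request example;
# the conversation text and the get_weather tool are hypothetical.
messages = [
    {"role": "user", "content": "What is the weather in Paris?"},
    {"role": "assistant", "content": "Let me check that for you."},
]

model_parameters = {
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool name
                "description": "Look up the current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "city": {"type": "string", "description": "City name"},
                    },
                    "description": "Arguments for the weather lookup.",
                    "required": ["city"],
                },
            },
        }
    ]
}
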
Response

200 (application/json)
Prompt version executed successfully

data (object, required)
Response payload containing the prompt output and token usage, as shown in the example response above (see the sketch below).

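To show how the response schema maps onto code, here is a small Python sketch that reads the output and usage fields from a parsed 200 response. The summarize_usage helper and the example values are hypothetical; the data, output, and usage field names come from the schema above.

# Sketch of reading the fields of a 200 response from /v1/prompts/run.
# The summarize_usage helper and the example values are hypothetical;
# the data/output/usage field names come from the schema above.
def summarize_usage(response_body: dict) -> str:
    """Return a one-line summary of token usage from a prompt run response."""
    usage = response_body["data"]["usage"]
    return (
        f"prompt={usage['promptTokens']} "
        f"completion={usage['completionTokens']} "
        f"total={usage['totalTokens']}"
    )

example = {
    "data": {
        "output": "Hello!",
        "usage": {"promptTokens": 100, "completionTokens": 23, "totalTokens": 123},
    }
}
print(example["data"]["output"])   # the prompt's output
print(summarize_usage(example))    # prompt=100 completion=23 total=123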