How do Prompt Tools Work?
Creating a prompt tool involves writing a function tailored to a specific task, then making it accessible to LLMs by exposing it as a prompt tool. This allows you to mimic and test an agentic flow.

How to create a prompt tool?
To create a prompt tool, use one of the approaches below.

Create a Code-Based Tool
Code editor interface
The interface provides:
- A code editor for writing your function
- An input panel on the right for testing
- A console at the bottom to view outputs
Example: Travel price calculator
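A minimal sketch of what such a code-based tool could look like — the function name, city pairs, and fare values below are illustrative assumptions, not Maxim's actual sample code:

```python
# Hypothetical code-based prompt tool: travel fare calculator.
# City pairs and fares are made-up illustration data.

def calculate_travel_fare(origin: str, destination: str) -> float:
    """Return the fare for a trip between two cities."""
    fares = {
        ("New York", "Boston"): 45.0,
        ("Boston", "Washington"): 80.0,
        ("New York", "Washington"): 60.0,
    }
    # Look up the route in either direction.
    if (origin, destination) in fares:
        return fares[(origin, destination)]
    if (destination, origin) in fares:
        return fares[(destination, origin)]
    raise ValueError(f"No fare data for {origin} -> {destination}")
```

Once exposed as a prompt tool, the LLM can call this function with city names it extracts from the user's message, and you can test the call in the input panel and inspect results in the console.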
The example here is a prompt tool that calculates travel fares between cities: given two city names, it returns the fare for that route.

Create a Schema-Based Tool
Overview
Schema-based prompt tools provide a structured way to define tools and ensure accurate, schema-compliant outputs. This approach is particularly useful when you need to guarantee that the LLM’s responses follow a specific format.

Creating a Schema Tool
Define your schema
Define your schema in the editor. Here’s an example schema for a stock price tool:
Function call schema
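A schema for such a stock price tool might look like the sketch below, written in the JSON-schema style commonly used for LLM function calling. The function name and fields are assumptions for illustration:

```python
# Hypothetical function-call schema for a stock price tool,
# in the JSON-schema style used for LLM function calling.
stock_price_schema = {
    "name": "get_stock_price",
    "description": "Get the current price of a stock by its ticker symbol",
    "parameters": {
        "type": "object",
        "properties": {
            "symbol": {
                "type": "string",
                "description": "Stock ticker symbol, e.g. AAPL",
            },
            "currency": {
                "type": "string",
                "description": "Currency for the returned price",
            },
        },
        # Only "symbol" is mandatory; "currency" is optional.
        "required": ["symbol"],
    },
}
```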
Testing Your Schema Tool
After creating your schema-based tool:
- Add it to a prompt configuration
- Test if the model correctly identifies when to use it
- Verify that the outputs match your schema’s structure
Create an API-Based Tool
Overview
Maxim allows you to expose external API endpoints as prompt tools. The platform automatically generates function schemas based on the API’s query parameters and payload structure.

Example
Here’s how an API payload gets converted into a function schema:
- Original API Payload:
Zipcode API payload
- Generated Schema for LLM:
Payload sent to the model while making requests
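The conversion can be sketched as follows. This is not Maxim's actual generator — the payload field, function name, and mapping logic are assumptions meant to show how a payload's fields become schema properties:

```python
# Hypothetical example: a zipcode API payload and the function schema
# a platform might generate from it.
api_payload = {"zipcode": "90210"}

def payload_to_schema(name: str, description: str, payload: dict) -> dict:
    """Derive a function-call schema from a sample payload.

    Each payload field becomes a property typed by its sample value;
    a real generator would also use query-parameter metadata.
    """
    type_map = {str: "string", int: "number", float: "number"}
    properties = {
        key: {"type": type_map.get(type(value), "string")}
        for key, value in payload.items()
    }
    return {
        "name": name,
        "description": description,
        "parameters": {
            "type": "object",
            "properties": properties,
            "required": list(payload.keys()),
        },
    }

schema = payload_to_schema(
    "get_zipcode_info", "Look up details for a US zipcode", api_payload
)
```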
Define Tool Variables
You can define variables for your prompt tools; these are automatically translated into properties within the function schema exposed to the LLM. The LLM uses these properties to decide the arguments for a tool/function call.

Variable Configuration:
- Type: Variables can be set as either string or number.
- Description: Add a description to help the LLM understand the variable’s purpose.
- Optionality: You can designate variables as optional or non-optional (required). The LLM uses this information, along with the user’s prompt, to determine whether to include the variable in its function call.

You can add variables to your prompt tools by following the steps below:

Add variables to Code-Based Tool
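The translation from variable settings to schema properties can be sketched like this. The data structure and field names are illustrative assumptions, not Maxim's internal representation:

```python
# Hypothetical mapping from tool-variable settings (type, description,
# optionality) to a function schema's "parameters" object.
from dataclasses import dataclass

@dataclass
class ToolVariable:
    name: str
    type: str          # "string" or "number"
    description: str
    required: bool

def variables_to_parameters(variables: list[ToolVariable]) -> dict:
    """Translate variable definitions into a JSON-schema parameters object."""
    return {
        "type": "object",
        "properties": {
            v.name: {"type": v.type, "description": v.description}
            for v in variables
        },
        # Only non-optional variables become required arguments.
        "required": [v.name for v in variables if v.required],
    }

params = variables_to_parameters([
    ToolVariable("city", "string", "City to look up", required=True),
    ToolVariable("limit", "number", "Max results to return", required=False),
])
```

The LLM sees each variable's type and description in the schema, and the `required` list tells it which arguments it must always supply.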
Add a description for the variable, then select its type and optionality.
Add variables to Schema-Based Tool
Add variables to API-Based Tool
Add a description for the variable, then select its type and optionality.