Overview
Bifrost provides 100% OpenAI API compatibility with enhanced features:
- Zero code changes - Works with existing OpenAI SDK applications
- Same request/response formats - Exact OpenAI API specification
- Enhanced capabilities - Multi-provider fallbacks, MCP tools, monitoring
- All endpoints supported - Chat completions, text completions, function calling
- Any provider under the hood - Use any configured provider (OpenAI, Anthropic, etc.)
Endpoint: `POST /openai/v1/chat/completions`
Provider Flexibility: While using the OpenAI SDK format, you can specify any model, such as `anthropic/claude-3-sonnet-20240229` or `openai/gpt-4o-mini`, and Bifrost will route the request to the appropriate provider automatically.
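For example, a direct request to the endpoint above might look like the following sketch (assuming a local Bifrost instance on `localhost:8080`; adjust the host and port to your deployment):

```python
import requests

# Hypothetical local Bifrost deployment; the host and port are assumptions.
resp = requests.post(
    "http://localhost:8080/openai/v1/chat/completions",
    json={
        # Provider-prefixed model name; Bifrost routes this request to Anthropic.
        "model": "anthropic/claude-3-sonnet-20240229",
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    },
)
print(resp.json()["choices"][0]["message"]["content"])
```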
Quick Migration
Python (OpenAI SDK)
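A minimal migration sketch, assuming Bifrost runs locally on port 8080: only the `base_url` changes, and the rest of your OpenAI SDK code stays the same.

```python
from openai import OpenAI

client = OpenAI(
    # Point the SDK at Bifrost's OpenAI-compatible route instead of api.openai.com.
    # The URL here is derived from the endpoint above; adjust it to your deployment.
    base_url="http://localhost:8080/openai/v1",
    # Provider keys are configured in Bifrost, so this can be a placeholder value.
    api_key="unused",
)

response = client.chat.completions.create(
    model="openai/gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello from the OpenAI SDK via Bifrost!"}],
)
print(response.choices[0].message.content)
```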
JavaScript (OpenAI SDK)
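The same change applies in JavaScript, sketched here with the official `openai` Node package (again assuming a local Bifrost instance; adjust the `baseURL` to your deployment):

```javascript
import OpenAI from "openai";

const client = new OpenAI({
  // Point the SDK at Bifrost's OpenAI-compatible route; this URL is an
  // assumption based on the endpoint above.
  baseURL: "http://localhost:8080/openai/v1",
  apiKey: "unused", // provider keys live in Bifrost's configuration
});

const response = await client.chat.completions.create({
  model: "anthropic/claude-3-sonnet-20240229",
  messages: [{ role: "user", content: "Hello from the Node SDK via Bifrost!" }],
});
console.log(response.choices[0].message.content);
```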
Supported Features
Fully Supported
| Feature | Status | Notes |
|---|---|---|
| Chat Completions | ✅ Full | All parameters supported |
| Function Calling | ✅ Full | Original + MCP tools |
| Vision/Multimodal | ✅ Full | Images, documents, etc. |
| System Messages | ✅ Full | All message types |
| Temperature/Top-p | ✅ Full | All sampling parameters |
| Stop Sequences | ✅ Full | Custom stop tokens |
| Max Tokens | ✅ Full | Token limit control |
| Presence/Frequency Penalty | ✅ Full | Repetition control |
Enhanced Features
| Feature | Enhancement | Benefit |
|---|---|---|
| Multi-provider Fallbacks | Automatic failover | Higher reliability |
| MCP Tool Integration | External tools available | Extended capabilities |
| Load Balancing | Multiple API keys | Better performance |
| Monitoring | Prometheus metrics | Observability |
| Rate Limiting | Built-in throttling | Cost control |