Complete guide to using Bifrost as a drop-in replacement for the OpenAI API, with full compatibility and enhanced features.
POST /openai/v1/chat/completions
🔄 Provider Flexibility: While using the OpenAI SDK format, you can specify any model, such as "anthropic/claude-3-sonnet-20240229" or "openai/gpt-4o-mini"; Bifrost will route the request to the appropriate provider automatically.
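The routing convention above can be sketched with a plain OpenAI-format payload. This is a minimal sketch using only the standard library; the exact Bifrost host and port are deployment-specific assumptions, so the request itself is left as a comment.

```python
import json

def build_chat_request(model: str, messages: list) -> dict:
    """Build an OpenAI-format chat payload; the "provider/model" prefix
    in `model` is what Bifrost uses to pick the upstream provider."""
    return {"model": model, "messages": messages}

payload = build_chat_request(
    "anthropic/claude-3-sonnet-20240229",
    [{"role": "user", "content": "Hello"}],
)

# The provider is simply the prefix before the first "/".
provider = payload["model"].split("/", 1)[0]

# POST json.dumps(payload) to http://<your-bifrost-host>/openai/v1/chat/completions
body = json.dumps(payload)
```

Because the body is the standard OpenAI chat-completions shape, existing OpenAI SDK clients can be pointed at Bifrost by changing only their base URL.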
| Feature | Status | Notes |
|---|---|---|
| Chat Completions | ✅ Full | All parameters supported |
| Function Calling | ✅ Full | Original + MCP tools |
| Vision/Multimodal | ✅ Full | Images, documents, etc. |
| System Messages | ✅ Full | All message types |
| Temperature/Top-p | ✅ Full | All sampling parameters |
| Stop Sequences | ✅ Full | Custom stop tokens |
| Max Tokens | ✅ Full | Token limit control |
| Presence/Frequency Penalty | ✅ Full | Repetition control |
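A single request can exercise several of the supported parameters above. The sketch below builds such a payload; the parameter values are illustrative, not recommendations.

```python
import json

# OpenAI-format chat payload combining sampling, stop, and penalty
# parameters from the compatibility table.
payload = {
    "model": "openai/gpt-4o-mini",
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize Bifrost in one sentence."},
    ],
    "temperature": 0.7,        # sampling temperature
    "top_p": 0.9,              # nucleus sampling cutoff
    "stop": ["\n\n"],          # custom stop sequence
    "max_tokens": 128,         # token limit control
    "presence_penalty": 0.1,   # discourage repeated topics
    "frequency_penalty": 0.2,  # discourage repeated tokens
}

# Serialize for the POST body to /openai/v1/chat/completions.
body = json.dumps(payload)
```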
| Feature | Enhancement | Benefit |
|---|---|---|
| Multi-provider Fallbacks | Automatic failover | Higher reliability |
| MCP Tool Integration | External tools available | Extended capabilities |
| Load Balancing | Multiple API keys | Better performance |
| Monitoring | Prometheus metrics | Observability |
| Rate Limiting | Built-in throttling | Cost control |
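The fallback row in the table can be illustrated with a small client-side loop: try each model in order until one succeeds. Bifrost performs this failover server-side; this sketch only demonstrates the concept, and `call_model` is a hypothetical stand-in, not part of Bifrost's API.

```python
def call_model(model: str, prompt: str) -> str:
    """Hypothetical stand-in for a provider call; the openai/ branch
    simulates an outage so the fallback path is exercised."""
    if model.startswith("openai/"):
        raise RuntimeError("provider unavailable")
    return f"[{model}] response to: {prompt}"

def complete_with_fallbacks(models: list, prompt: str) -> str:
    """Try each model in order, returning the first successful response."""
    last_err = None
    for model in models:
        try:
            return call_model(model, prompt)
        except RuntimeError as err:
            last_err = err  # record the failure and try the next model
    raise RuntimeError(f"all fallbacks exhausted: {last_err}")

result = complete_with_fallbacks(
    ["openai/gpt-4o-mini", "anthropic/claude-3-sonnet-20240229"],
    "ping",
)
```

With Bifrost, the equivalent chain is configured on the gateway, so clients send one request and the failover happens transparently.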