Get Bifrost running as an HTTP API gateway in 30 seconds with zero configuration. Perfect for any programming language.
| Flag | Default | NPX | Docker | Description |
|------|---------|-----|--------|-------------|
| `port` | `8080` | `-port 8080` | `-e APP_PORT=8080 -p 8080:8080` | HTTP server port |
| `host` | `localhost` | `-host 0.0.0.0` | `-e APP_HOST=0.0.0.0` | Host to bind the server to |
| `log-level` | `info` | `-log-level info` | `-e LOG_LEVEL=info` | Log level (`debug`, `info`, `warn`, `error`) |
| `log-style` | `json` | `-log-style json` | `-e LOG_STYLE=json` | Log style (`pretty`, `json`) |
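As an illustration, the flags above can be combined at launch. The flag and environment-variable names come from the table; the exact NPX package and Docker image names are placeholders here and depend on your install:

```shell
# NPX: bind to all interfaces with verbose, human-readable logs
# ("bifrost" is a placeholder for the actual package name)
npx bifrost -port 8080 -host 0.0.0.0 -log-level debug -log-style pretty

# Docker: the same settings expressed as environment variables
# ("bifrost" is a placeholder for the actual image name)
docker run -p 8080:8080 \
  -e APP_HOST=0.0.0.0 \
  -e LOG_LEVEL=debug \
  -e LOG_STYLE=pretty \
  bifrost
```

With `-host 0.0.0.0` the gateway accepts connections from other machines; the `localhost` default keeps it local-only.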
The `-app-dir` flag determines where Bifrost stores all its data:

- `config.json` - Configuration file (optional)
- `config.db` - SQLite database for UI configuration
- `logs.db` - Request logs database

The `/v1/chat/completions` endpoint works with any provider (OpenAI, Anthropic, Bedrock, etc.), and the model string `openai/gpt-4o-mini` tells Bifrost to use OpenAI's GPT-4o Mini model.
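As a sketch, here is what such a request could look like with `curl`, assuming the default `localhost:8080` bind from the table above and the usual OpenAI-style chat-completions body (the prompt text is illustrative):

```shell
# POST an OpenAI-style chat completion through the gateway;
# the "provider/model" string picks the upstream provider.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-4o-mini",
    "messages": [
      {"role": "user", "content": "Hello!"}
    ]
  }'
```

Changing only the provider prefix in `model` routes the same request to a different provider.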
What happens at startup depends on your setup:

- No `config.json` file exists (Bifrost auto-creates the SQLite database)
- `config.json` exists, with or without `config_store` configured

If you keep a `config.json` in your app directory:
Without a `config_store` in `config.json`:

- `config.json` is never modified
- Changes to `config.json` only apply after restart

With a `config_store` in `config.json`:
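For orientation, a `config_store` entry in `config.json` might look roughly like this; the field names below are assumptions for illustration, so check the Bifrost configuration reference for the exact schema:

```json
{
  "config_store": {
    "enabled": true,
    "type": "sqlite",
    "config": { "path": "./config.db" }
  }
}
```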
- Bifrost bootstraps from `config.json` settings, then uses the DB exclusively
- The database takes precedence over `config.json` configurations
- Editing `config.json` after the initial bootstrap has no effect when `config_store` is enabled. Use the public HTTP APIs to make configuration changes instead.
The Three Stores Explained: