[ MODEL COMPARISON ]

Compare codegeex4 with other models

Select another model to compare pricing, limits, and capabilities with codegeex4.

Model: codegeex4 (ollama)
Context Length: 33K
Max Output: 8K
Mode: Chat
Max Input Tokens: 33K
Max Tokens: 8K
Provider: Ollama

Comparison Insights

Comprehensive analysis based on the latest model metadata from the comparison table above.

What should I know about codegeex4?

Overview

  • codegeex4 is a chat model provided by Ollama.
  • The model supports a 33K-token context window, suitable for moderate-sized documents and multi-turn conversations.
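Since codegeex4 is served through Ollama, it can be called over Ollama's local HTTP API. The sketch below builds a request for the `/api/chat` endpoint, assuming a default Ollama install listening on `localhost:11434`; the `num_ctx` and `num_predict` options mirror the ~33K context and ~8K output limits from the table above (the exact token counts are an assumption based on the rounded figures shown).

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # default local Ollama endpoint


def build_chat_request(prompt: str,
                       num_ctx: int = 32768,
                       num_predict: int = 8192) -> dict:
    """Build a payload for Ollama's /api/chat endpoint.

    num_ctx (~33K) and num_predict (~8K) reflect the limits in the
    comparison table above; adjust if your build of the model differs.
    """
    return {
        "model": "codegeex4",
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # return one complete response instead of a stream
        "options": {"num_ctx": num_ctx, "num_predict": num_predict},
    }


def chat(prompt: str) -> str:
    """Send a single-turn chat request to a locally running Ollama server."""
    payload = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]


if __name__ == "__main__":
    # Requires `ollama pull codegeex4` and a running Ollama server.
    print(chat("Write a Python function that reverses a string."))
```

Because the model runs locally, there is no API key; the only prerequisite is that the model has been pulled and the Ollama server is running.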

Pricing

  • Input processing costs $0.00 per million tokens.
  • Output generation costs $0.00 per million tokens; Ollama serves models locally, so there are no per-token charges.

Output Capabilities

  • The model can generate up to 8K tokens in a single response.
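Because generated tokens typically count against the same context window as the prompt, a prompt that nearly fills the 33K window leaves little room for a full 8K-token response. The sketch below checks that budget, using a rough ~4-characters-per-token heuristic; this heuristic is an assumption for illustration, not a property of codegeex4's tokenizer.

```python
CONTEXT_WINDOW = 32768   # ~33K-token context window (from the table above)
MAX_OUTPUT = 8192        # ~8K-token output cap (from the table above)


def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token for English text.

    Real counts depend on the model's tokenizer; use this only as a
    quick sanity check, not an exact budget.
    """
    return max(1, len(text) // 4)


def fits_with_full_output(prompt: str) -> bool:
    """True if the prompt leaves room for a maximum-length (8K-token) response,
    assuming output tokens share the context window with the prompt."""
    return estimate_tokens(prompt) + MAX_OUTPUT <= CONTEXT_WINDOW
```

Under this heuristic, a prompt of roughly 98,000 characters (about 24.5K estimated tokens) is the most that still leaves a full 8K tokens for the response.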