Jamba
Breakthrough SSM-Transformer Open Model
Editor Recommendation · Productivity · Language Model · Large-Scale Corpus
Jamba is an open-weights language model built on a hybrid SSM-Transformer architecture, delivering top-tier quality and performance. By combining the strengths of the Transformer and SSM (state space model) architectures, it achieves strong results on reasoning benchmarks while providing 3x the throughput of comparable models in long-context scenarios. Jamba is currently the only model of its scale that fits a 140K-token context on a single GPU, offering exceptional cost-effectiveness. As a base model, Jamba is designed for developers to fine-tune, train on top of, and build customized solutions with.
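Since Jamba ships as an open-weights base model, a typical starting point is loading it for inference through the Hugging Face transformers library. The sketch below is an illustration rather than an official quickstart: it assumes a transformers version with native Jamba support, the published checkpoint id ai21labs/Jamba-v0.1, and a GPU with enough memory for half-precision weights.

```python
# Minimal sketch: running Jamba for text generation with Hugging Face transformers.
# Assumptions: a transformers release with native Jamba support, the accelerate
# package installed (for device_map="auto"), and sufficient GPU memory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai21labs/Jamba-v0.1"  # published open-weights checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to fit long contexts in memory
    device_map="auto",           # place layers across available devices
)

prompt = "Hybrid SSM-Transformer architectures matter because"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For fine-tuning, the same checkpoint can be passed to standard training tooling (for example, the Hugging Face Trainer or PEFT/LoRA workflows), since the model loads through the ordinary AutoModelForCausalLM interface.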
Jamba Visits Over Time
Monthly Visits: 87,280
Bounce Rate: 48.13%
Pages per Visit: 2.3
Avg. Visit Duration: 00:00:39