StableLM-2-12B

A decoder-only language model with 12.1 billion parameters

Productivity · Language Model · Text Generation
Stable LM 2 12B is a decoder-only language model with 12.1 billion parameters, pre-trained on a multilingual and code dataset of 2 trillion tokens. It can serve as a base model for fine-tuning on downstream tasks, but it should be evaluated and fine-tuned before deployment to ensure safe and reliable performance. The model may produce inappropriate content, so exercise caution when using it, especially in applications where harm to others is possible.

StableLM-2-12B Visit Over Time

Monthly Visits

17,104,189

Bounce Rate

44.67%

Page per Visit

5.5

Visit Duration

00:05:49
