StableLM-2-12B
A decoder-only language model with 12.1 billion parameters
Common Product · Productivity · Language Model · Text Generation
Stable LM 2 12B is a decoder-only language model with 12.1 billion parameters, pre-trained on a multilingual and code dataset of 2 trillion tokens. It can serve as a base model for fine-tuning on downstream tasks, but it requires evaluation and fine-tuning before deployment to ensure safe and reliable performance. The model may produce inappropriate content, so caution is advised, especially in applications where its output could cause harm.
StableLM-2-12B Visits Over Time
Monthly Visits
20,899,836
Bounce Rate
46.04%
Pages per Visit
5.2
Visit Duration
00:04:57