Stable Code 3B
Stable Code 3B - A pre-trained language model for text generation
Stable Code 3B is a decoder-only language model with 2.7 billion parameters, pre-trained on 130 billion tokens of diverse text and code. It was trained on 18 programming languages and achieves state-of-the-art performance for its size across multiple languages on the BigCode evaluation benchmark. The model supports long contexts, having been trained with a sequence length of 16,384 tokens, and incorporates the Fill-in-the-Middle (FIM) training technique. Users can get started with Stable Code 3B for text generation using the code snippets available on the Hugging Face website. Developed by Stability AI on the GPT-NeoX library, the model handles both English and programming languages.
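The Fill-in-the-Middle technique mentioned above lets the model complete code given both the text before and after the insertion point. The sketch below shows how such a prompt is typically assembled, assuming the StarCoder-style special tokens (`<fim_prefix>`, `<fim_suffix>`, `<fim_middle>`); the exact token names and the model id `stabilityai/stable-code-3b` should be confirmed against the model card on Hugging Face.

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a Fill-in-the-Middle prompt: the model is asked to
    generate the code that belongs between `prefix` and `suffix`.

    Token names are an assumption based on the common StarCoder-style
    FIM convention; check the Stable Code 3B model card for the
    authoritative special tokens."""
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"


def generate_completion(prompt: str, max_new_tokens: int = 48) -> str:
    """Hedged sketch of running the prompt through Hugging Face
    transformers; not executed here because it downloads the weights."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Model id assumed from the Hugging Face naming convention.
    tokenizer = AutoTokenizer.from_pretrained("stabilityai/stable-code-3b")
    model = AutoModelForCausalLM.from_pretrained("stabilityai/stable-code-3b")
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


# Ask the model to fill in the body of a function.
prompt = build_fim_prompt("def add(a, b):\n    return ", "\n")
print(prompt)
```

Only the prompt assembly runs at import time; `generate_completion` illustrates the standard transformers loading pattern without triggering a model download.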
Stable Code 3B Visits Over Time
Monthly Visits: 17,788,201
Bounce Rate: 44.87%
Pages per Visit: 5.4
Average Visit Duration: 00:05:32