Stable Code 3B

Stable Code 3B - A pre-trained language model for text generation

Stable Code 3B is a decoder-only language model with 2.7 billion parameters, pre-trained on 130 billion tokens of diverse text and code. It covers 18 programming languages and achieves state-of-the-art performance among similarly sized models on the BigCode evaluation benchmark. The model supports long contexts, having been trained with a sequence length of 16,384 tokens, and incorporates the Fill-in-the-Middle (FIM) technique. Users can get started with Stable Code 3B for text generation using the code snippets available on its Hugging Face model page. Developed by Stability AI on top of the GPT-NeoX library, the model handles both English and programming-language text.
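The Fill-in-the-Middle technique mentioned above works by rearranging a completion task so the model sees the code before and after a gap and generates the missing span. As a minimal sketch: the sentinel token names used below (`<fim_prefix>`, `<fim_suffix>`, `<fim_middle>`) follow the common StarCoder-style convention and are an assumption here, not confirmed by this page; verify them against the model's tokenizer configuration before use.

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a Fill-in-the-Middle prompt: the model is asked to
    generate the code that belongs between `prefix` and `suffix`.
    Sentinel token names are assumed (StarCoder-style convention)."""
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

# Example: ask the model to fill in the body of a function, given
# the surrounding code. The generated text would follow <fim_middle>.
prompt = build_fim_prompt(
    "def add(a, b):\n    return ",
    "\n\nprint(add(1, 2))",
)
print(prompt)
```

In practice this prompt string would be tokenized and passed to the model's generation call (for example via the Hugging Face snippets the page refers to); the sketch above only shows the prompt layout, which is the part specific to FIM.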

Stable Code 3B Visit Over Time

Monthly Visits

19,075,321

Bounce Rate

45.07%

Pages per Visit

5.5

Visit Duration

00:05:32

Stable Code 3B Visit Trend

Stable Code 3B Visit Geography

Stable Code 3B Traffic Sources

Stable Code 3B Alternatives