StarCoder2
Large-scale code generation pretraining model
Common Product · Programming · Transformer · Code Generation
StarCoder2 is a 15-billion-parameter Transformer model pretrained on source code spanning more than 600 programming languages, drawn largely from GitHub. It incorporates techniques such as Grouped Query Attention. The model is suited to code generation tasks and supports many programming languages.
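Grouped Query Attention reduces memory traffic by letting several query heads share one key/value head. The listing does not describe StarCoder2's internals, so this is only a minimal NumPy sketch of the general mechanism, with illustrative head counts and dimensions, not the model's actual implementation.

```python
import numpy as np

def grouped_query_attention(q, k, v, n_kv_heads):
    """Sketch of grouped-query attention.

    q: (n_heads, seq, d) query projections
    k, v: (n_kv_heads, seq, d) shared key/value projections
    Each group of n_heads // n_kv_heads query heads attends
    against the same K/V head, shrinking the KV cache.
    """
    n_heads, seq, d = q.shape
    group = n_heads // n_kv_heads
    out = np.empty_like(q)
    for h in range(n_heads):
        kv = h // group  # query heads in a group share this K/V head
        scores = q[h] @ k[kv].T / np.sqrt(d)
        # numerically stable softmax over the key axis
        scores -= scores.max(axis=-1, keepdims=True)
        w = np.exp(scores)
        w /= w.sum(axis=-1, keepdims=True)
        out[h] = w @ v[kv]
    return out

# Toy shapes: 8 query heads sharing 2 K/V heads
rng = np.random.default_rng(0)
q = rng.normal(size=(8, 4, 16))
k = rng.normal(size=(2, 4, 16))
v = rng.normal(size=(2, 4, 16))
out = grouped_query_attention(q, k, v, n_kv_heads=2)
```

With `n_kv_heads` equal to the number of query heads this degenerates to ordinary multi-head attention; with `n_kv_heads = 1` it becomes multi-query attention.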
StarCoder2 Visit Over Time
Monthly Visits
19,075,321
Bounce Rate
45.07%
Pages per Visit
5.5
Visit Duration
00:05:32