StarCoder2
Large-scale code generation pretraining model
Tags: Common Product, Programming, Transformer, Code Generation
StarCoder2 is a family of Transformer models for code generation, with up to 15 billion parameters, pretrained on source code spanning more than 600 programming languages, largely drawn from GitHub. It uses techniques such as Grouped Query Attention, and it is well suited to code generation tasks across many programming languages.
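To illustrate the Grouped Query Attention mentioned above: multiple query heads share a single key/value head, which shrinks the KV cache compared with full multi-head attention. The sketch below is a minimal NumPy illustration of the idea, not StarCoder2's actual implementation; all function and variable names here are hypothetical.

```python
import numpy as np

def grouped_query_attention(q, k, v, num_q_heads, num_kv_heads):
    """Minimal GQA sketch (hypothetical helper, not the StarCoder2 code).

    q: (seq, num_q_heads, head_dim)
    k, v: (seq, num_kv_heads, head_dim), with num_kv_heads < num_q_heads
    """
    group = num_q_heads // num_kv_heads
    # Each K/V head is shared by `group` consecutive query heads.
    k = np.repeat(k, group, axis=1)
    v = np.repeat(v, group, axis=1)
    d = q.shape[-1]
    # Scaled dot-product scores per head: (heads, q_len, k_len).
    scores = np.einsum("qhd,khd->hqk", q, k) / np.sqrt(d)
    # Softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Weighted sum of values back to (seq, num_q_heads, head_dim).
    return np.einsum("hqk,khd->qhd", weights, v)

rng = np.random.default_rng(0)
q = rng.standard_normal((4, 8, 16))   # 8 query heads
k = rng.standard_normal((4, 2, 16))   # only 2 KV heads
v = rng.standard_normal((4, 2, 16))
out = grouped_query_attention(q, k, v, num_q_heads=8, num_kv_heads=2)
```

The memory saving comes from storing only 2 K/V heads instead of 8 in this toy setup, while the output keeps the full query-head shape.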
StarCoder2 Visits Over Time
Monthly Visits: 20,899,836
Bounce Rate: 46.04%
Pages per Visit: 5.2
Visit Duration: 00:04:57