Qwen2.5-Coder Technical Report
The Qwen2.5-Coder series consists of code-specific models built on the Qwen2.5 architecture, including Qwen2.5-Coder-1.5B and Qwen2.5-Coder-7B. These models are continually pre-trained on a corpus of over 5.5 trillion tokens and deliver strong code generation capabilities while retaining general-purpose abilities, thanks to meticulous data cleaning, scalable synthetic data generation, and balanced data mixing. Qwen2.5-Coder achieves state-of-the-art results on more than ten benchmarks spanning code-related tasks, including code generation, completion, reasoning, and repair, consistently outperforming even larger models. The release of this series not only pushes the boundaries of code-intelligence research but also, through its permissive licensing, encourages developers to adopt it in real-world applications.