Qwen2.5-Coder-7B
A 7 billion parameter code generation model from the Qwen2.5-Coder series.
Tags: Common Product, Programming, Code Generation, Code Reasoning
Qwen2.5-Coder-7B is a large language model built on Qwen2.5, focused on code generation, code reasoning, and code fixing. It was trained on 5.5 trillion tokens spanning source code, text-code grounding data, and synthetic data, representing the latest advancements in open-source code language models. According to its developers, the model matches GPT-4o in coding capability while retaining strong mathematics and general-purpose skills, and it supports long contexts of up to 128K tokens.
Qwen2.5-Coder-7B Visit Over Time
Monthly Visits: 20,899,836
Bounce Rate: 46.04%
Pages per Visit: 5.2
Visit Duration: 00:04:57