Qwen2.5-Coder-7B

A 7 billion parameter code generation model from the Qwen2.5-Coder series.

Tags: Common Product · Programming · Code Generation · Code Reasoning
Qwen2.5-Coder-7B is a large language model built on Qwen2.5 and focused on code generation, code reasoning, and code fixing. It was trained on 5.5 trillion tokens spanning source code, text-code grounding data, and synthetic data, representing the latest advances in open-source code language models. The model is claimed to match GPT-4o on programming tasks while retaining strong mathematics and general-purpose capabilities, and it supports long contexts of up to 128K tokens.
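Besides ordinary left-to-right generation, code models in this series support fill-in-the-middle (FIM) completion, where the model fills a gap between a given prefix and suffix. The sketch below assembles such a prompt using the FIM special tokens described for Qwen2.5-Coder; the token names are an assumption here and should be verified against the model's tokenizer configuration before use.

```python
# Sketch of a fill-in-the-middle (FIM) prompt for Qwen2.5-Coder models.
# The <|fim_*|> special tokens below are assumed from the Qwen2.5-Coder
# documentation; verify them against the actual tokenizer before relying
# on this format.

def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a FIM prompt: the model is asked to generate the
    missing middle between `prefix` and `suffix`."""
    return f"<|fim_prefix|>{prefix}<|fim_suffix|>{suffix}<|fim_middle|>"

# Example: ask the model to complete the body of a function.
prompt = build_fim_prompt(
    "def add(a, b):\n    return ",
    "\n\nprint(add(2, 3))\n",
)
print(prompt)
```

The resulting string would be passed to the tokenizer and `generate` as with any causal prompt; the model's output is the code that belongs between the prefix and suffix.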

Qwen2.5-Coder-7B Visit Over Time

Monthly Visits: 19,075,321
Bounce Rate: 45.07%
Pages per Visit: 5.5
Visit Duration: 00:05:32
