Qwen2.5-Coder-3B-Instruct-GPTQ-Int4

3B-parameter instruction-tuned model in the Qwen2.5-Coder series

Common Product · Programming · Code Generation · Code Inference
Qwen2.5-Coder is the latest series of code-specific Qwen large language models, designed for code generation, code reasoning, and code fixing. It builds on Qwen2.5 and extends training to 5.5 trillion tokens spanning source code, text-code grounding data, and synthetic data. The flagship Qwen2.5-Coder-32B ranks among the top open-source code LLMs, matching the coding capabilities of GPT-4o. This release is the GPTQ 4-bit quantized version of the instruction-tuned 3B-parameter Qwen2.5-Coder model: a causal language model built on the Transformers architecture and trained through both pretraining and post-training stages.
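As a rough illustration of how an instruction-tuned, GPTQ-quantized checkpoint like this is typically used, the sketch below loads it with Hugging Face Transformers and generates a chat-style completion. It assumes the model is published on the Hub under the id Qwen/Qwen2.5-Coder-3B-Instruct-GPTQ-Int4 and that a GPTQ-capable backend (for example auto-gptq via optimum) plus accelerate are installed; the prompt text is purely illustrative.

```python
# Minimal usage sketch, assuming the Hub repo id below and an installed GPTQ backend.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-Coder-3B-Instruct-GPTQ-Int4"  # assumed repository name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Build a chat-style prompt for the instruction-tuned model.
messages = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a Python function that checks if a string is a palindrome."},
]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256)

# Decode only the newly generated tokens, skipping the prompt.
completion = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(completion)
```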

Qwen2.5-Coder-3B-Instruct-GPTQ-Int4 Visit Over Time

Monthly Visits: 19,075,321

Bounce Rate: 45.07%

Pages per Visit: 5.5

Visit Duration: 00:05:32

Charts: Visit Trend, Visit Geography, Traffic Sources, Alternatives