Qwen1.5-32B

A series of Transformer-based pre-trained language models

Qwen1.5 is a series of decoder-only language models based on the Transformer architecture, available in a range of sizes. It features SwiGLU activation, attention QKV bias, and grouped query attention. It supports multiple natural languages as well as code. Fine-tuning, such as SFT or RLHF, is recommended for downstream use. It is free to use.
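The SwiGLU activation mentioned above replaces the standard feed-forward block in the Transformer. A minimal numpy sketch of such a gated MLP is shown below; the dimensions and weight names are illustrative assumptions, not the actual Qwen1.5-32B configuration.

```python
import numpy as np

def silu(x):
    # SiLU / Swish activation: x * sigmoid(x)
    return x / (1.0 + np.exp(-x))

def swiglu_mlp(x, w_gate, w_up, w_down):
    # SwiGLU feed-forward block: down( silu(gate(x)) * up(x) )
    return (silu(x @ w_gate) * (x @ w_up)) @ w_down

# Toy dimensions for illustration; the real model uses far larger sizes.
rng = np.random.default_rng(0)
d_model, d_ff = 8, 16
x = rng.standard_normal((2, d_model))
w_gate = rng.standard_normal((d_model, d_ff))
w_up = rng.standard_normal((d_model, d_ff))
w_down = rng.standard_normal((d_ff, d_model))
y = swiglu_mlp(x, w_gate, w_up, w_down)
print(y.shape)  # (2, 8)
```

The gate path modulates the up-projection element-wise before the down-projection, which is what distinguishes SwiGLU from a plain two-layer MLP.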

Qwen1.5-32B Visit Over Time

Monthly Visits: 17,788,201
Bounce Rate: 44.87%
Pages per Visit: 5.4
Visit Duration: 00:05:32
