Qwen1.5-32B
A series of Transformer-based pre-trained language models
Tags: Common Product · Productivity · Pre-trained model · Transformer
Qwen1.5 is a series of decoder-only language models based on the Transformer architecture, available in multiple sizes. It features SwiGLU activation, attention QKV bias, and grouped-query attention, and it supports multiple natural languages as well as code. Fine-tuning (e.g., SFT or RLHF) is recommended for downstream use. The model is free to use.
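The grouped-query attention mentioned above reduces memory use by letting several query heads share one key/value head. A minimal NumPy sketch of the idea (head counts and dimensions are illustrative; this is non-causal, omits the QKV bias and multi-layer structure, and is not Qwen's actual implementation):

```python
import numpy as np

def grouped_query_attention(q, k, v):
    """q: (seq, n_q_heads, d); k, v: (seq, n_kv_heads, d).
    Each KV head serves n_q_heads // n_kv_heads query heads."""
    seq, n_q, d = q.shape
    n_kv = k.shape[1]
    assert n_q % n_kv == 0, "query heads must divide evenly among KV heads"
    group = n_q // n_kv
    out = np.empty_like(q)
    for h in range(n_q):
        kv = h // group                               # shared KV head for this query head
        scores = (q[:, h] @ k[:, kv].T) / np.sqrt(d)  # (seq, seq) attention logits
        scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
        w = np.exp(scores)
        w /= w.sum(axis=-1, keepdims=True)            # softmax over key positions
        out[:, h] = w @ v[:, kv]
    return out

# Example: 8 query heads sharing 2 KV heads (4 query heads per group)
rng = np.random.default_rng(0)
q = rng.standard_normal((5, 8, 16))
k = rng.standard_normal((5, 2, 16))
v = rng.standard_normal((5, 2, 16))
out = grouped_query_attention(q, k, v)  # shape (5, 8, 16)
```

With `n_kv_heads == n_q_heads` this reduces to standard multi-head attention; with `n_kv_heads == 1` it becomes multi-query attention, so GQA interpolates between the two.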
Qwen1.5-32B Visits Over Time
Monthly Visits: 19,075,321
Bounce Rate: 45.07%
Pages per Visit: 5.5
Visit Duration: 00:05:32