Qwen1.5-MoE-A2.7B
A large-scale MoE (Mixture of Experts) language model that activates only 2.7B parameters per token yet rivals the performance of 7 billion parameter dense models.
Qwen1.5-MoE-A2.7B Visits Over Time
Monthly Visits: 4,314,278
Bounce Rate: 68.45%
Pages per Visit: 1.7
Visit Duration: 00:01:08