Mixtral-8x22B

A large language model built on a sparse mixture-of-experts (MoE) architecture.

Tags: Product, Programming, Language Model, Text Generation
Mixtral-8x22B is a pre-trained generative sparse mixture-of-experts language model developed by the Mistral AI team to advance the open development of artificial intelligence. It has 141B total parameters, of which roughly 39B are active per token, and it supports deployment optimizations such as half-precision (FP16/BF16) inference and quantization to fit different hardware and application scenarios. Mixtral-8x22B can be used for natural language processing tasks such as text generation, question answering, and translation.
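As a minimal sketch of the deployment options mentioned above, the following shows how the model might be loaded for text generation with the Hugging Face transformers library. The repository id, the 4-bit quantization settings, and the prompt are illustrative assumptions, not details taken from this page.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

MODEL_ID = "mistralai/Mixtral-8x22B-v0.1"  # assumed Hugging Face repository id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# Option 1: half-precision (bfloat16) weights, sharded across available GPUs.
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Option 2 (alternative): 4-bit quantization via bitsandbytes to reduce memory.
# quant_config = BitsAndBytesConfig(load_in_4bit=True,
#                                   bnb_4bit_compute_dtype=torch.bfloat16)
# model = AutoModelForCausalLM.from_pretrained(
#     MODEL_ID, quantization_config=quant_config, device_map="auto")

# A simple translation prompt, exercising the text-generation interface.
prompt = "Translate to French: The weather is nice today."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Half precision halves the memory footprint relative to FP32, while 4-bit quantization cuts it further at some cost in accuracy; which option fits depends on the available GPU memory.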

Mixtral-8x22B Traffic Overview

Monthly Visits: 17,104,189
Bounce Rate: 44.67%
Pages per Visit: 5.5
Average Visit Duration: 00:05:49

[Charts: Mixtral-8x22B visit trend, visit geography, and traffic sources]
