Mixtral-8x22B
A large language model based on a sparse Mixture-of-Experts (SMoE) architecture.
Common Product · Programming · Language Model · Text Generation
Mixtral-8x22B is a pretrained generative sparse Mixture-of-Experts (SMoE) language model developed by the Mistral AI team to advance the open development of artificial intelligence. With 141B parameters, it supports deployment optimizations such as half-precision inference and quantization to fit different hardware and application scenarios. Mixtral-8x22B can be used for natural language processing tasks such as text generation, question answering, and translation.
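The "8x22B" naming reflects the sparse Mixture-of-Experts design: each MoE layer holds 8 expert networks, but a router activates only 2 of them per token, so far fewer than the full 141B parameters are used for any single token. A minimal sketch of top-2 expert routing, with hypothetical tiny dimensions and random weights standing in for trained parameters (the real model uses different expert MLP internals; a ReLU MLP is used here purely for illustration):

```python
# Illustrative top-2 sparse Mixture-of-Experts routing:
# 8 experts, only 2 selected per token by a learned router.
# Dimensions and weights here are hypothetical stand-ins.
import numpy as np

rng = np.random.default_rng(0)

n_experts, d_model, d_ff = 8, 16, 32
top_k = 2

# Router projection and per-expert MLP weights (random stand-ins).
w_gate = rng.standard_normal((d_model, n_experts))
experts = [
    (rng.standard_normal((d_model, d_ff)), rng.standard_normal((d_ff, d_model)))
    for _ in range(n_experts)
]

def moe_layer(x):
    """Route a single token vector x through its top-2 experts."""
    logits = x @ w_gate                      # one score per expert
    top = np.argsort(logits)[-top_k:]        # indices of the 2 highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                 # softmax over the selected experts only
    out = np.zeros_like(x)
    for w, i in zip(weights, top):
        w_in, w_out = experts[i]
        out += w * (np.maximum(x @ w_in, 0.0) @ w_out)  # simple ReLU expert MLP
    return out

token = rng.standard_normal(d_model)
print(moe_layer(token).shape)  # (16,)
```

Because only `top_k` of the `n_experts` expert MLPs run per token, compute per token scales with the active experts rather than the total parameter count, which is what makes half-precision and quantized deployment of such a large model practical.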
Mixtral-8x22B Visit Over Time
Monthly Visits: 19,075,321
Bounce Rate: 45.07%
Pages per Visit: 5.5
Visit Duration: 00:05:32