MoE 8x7B
MistralAI's new 8x7B mixture-of-experts (MoE) base model for text generation.
Tags: Common Product, Writing, Text Generation, Mixture of Experts
This model uses a mixture-of-experts (MoE) architecture, combining eight 7B-parameter expert networks and activating only a subset of them per token, to produce high-quality text. It is applicable to a wide range of text generation tasks. Pricing is usage-based; see the official website for details.
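The routing idea behind an MoE layer can be sketched in a few lines: a gate scores all experts, only the top-k (top-2 of 8 in this model family) are evaluated, and their outputs are combined with renormalized weights. This is a minimal illustrative sketch; the toy expert functions and gate scores below are hypothetical, not the model's actual parameters.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(x, experts, gate_scores, top_k=2):
    # Select the top-k experts by gate score, renormalize their
    # weights with a softmax, and mix only those experts' outputs.
    ranked = sorted(range(len(experts)), key=lambda i: gate_scores[i], reverse=True)
    chosen = ranked[:top_k]
    weights = softmax([gate_scores[i] for i in chosen])
    return sum(w * experts[i](x) for w, i in zip(weights, chosen))

# Eight toy "experts": each simply scales its scalar input.
experts = [lambda x, s=s: s * x for s in range(1, 9)]
gate_scores = [0.1, 2.0, 0.3, 1.5, 0.0, 0.2, 0.4, 0.1]  # hypothetical router logits
y = moe_forward(1.0, experts, gate_scores)
```

Because only two of the eight experts run per input, compute per token stays far below that of a dense model with the same total parameter count, which is the main appeal of the MoE design.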
MoE 8x7B Traffic Over Time
Monthly Visits: 1,920,278
Bounce Rate: 40.85%
Pages per Visit: 4.8
Visit Duration: 00:04:46