MiniMax Launches China's First MoE Large Language Model abab6

Source: 站长之家 (Chinaz)

On January 16, 2024, MiniMax released abab6, the first Mixture-of-Experts (MoE) large language model in China. The MoE architecture allows the model to handle complex tasks while processing more training data per unit of compute time. Evaluation results show that abab6 outperforms its predecessor, abab5.5, in instruction following and in comprehensive Chinese and English abilities, and surpasses other large language models such as GPT-3.5 on these measures. In demonstrations, abab6 handled tasks such as tutoring children on math problems and helping design a fictional board game set in Shanghai.
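The article does not disclose abab6's internals, but the efficiency claim above follows from how MoE layers generally work: a router sends each token to only a few "expert" feed-forward networks, so total parameters can grow without increasing per-token compute. Below is a minimal, illustrative sketch of a top-k-gated MoE layer in PyTorch; the class name, expert count, and routing scheme are hypothetical choices for demonstration, not MiniMax's design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoELayer(nn.Module):
    """Toy Mixture-of-Experts feed-forward layer with top-k gating.

    Illustrative only: abab6's actual architecture, expert count, and
    routing scheme are not published in the article above.
    """

    def __init__(self, d_model=64, d_hidden=256, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, num_experts)  # router: token -> expert scores
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.GELU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(num_experts)
        )

    def forward(self, x):  # x: (batch, d_model)
        scores = self.gate(x)                            # (batch, num_experts)
        top_w, top_idx = scores.topk(self.top_k, dim=-1) # keep k best experts per token
        top_w = F.softmax(top_w, dim=-1)                 # normalize weights over chosen experts
        out = torch.zeros_like(x)
        # Only the selected experts run for each token, which is why an MoE
        # model can add parameters without adding per-token compute.
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = top_idx[:, slot] == e
                if mask.any():
                    out[mask] += top_w[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

layer = ToyMoELayer()
print(layer(torch.randn(4, 64)).shape)  # torch.Size([4, 64])
```

With top_k=2 of 8 experts, each token activates roughly a quarter of the layer's expert parameters per forward pass, which is the sense in which an MoE model can "train more data within a unit of time" than a dense model of equal parameter count.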
© AIbase 2024. Source: https://www.aibase.com/news/4893