On January 16, 2024, MiniMax released abab6, China's first MoE (Mixture of Experts) large language model. The MoE architecture allows abab6 to handle complex tasks and to train on more data in less time. abab6 outperforms its predecessor, abab5.5, in instruction following, Chinese comprehension, and English comprehension. In demonstrations, abab6 showed strong capabilities, such as walking children through math problems and helping design a fictional board game about Shanghai.
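To illustrate the general idea behind MoE, the sketch below shows a top-k routed expert layer in PyTorch: a router scores the experts for each token and only the top-k experts actually run, which is why an MoE model can grow its total parameter count without a proportional increase in compute per token. This is a generic, minimal sketch, not MiniMax's abab6 implementation (which has not been published); all class names and hyperparameters here are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoELayer(nn.Module):
    """Illustrative sparse MoE feed-forward layer: a router picks the
    top-k experts per token, so only a fraction of the parameters is active."""

    def __init__(self, d_model: int, d_hidden: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts)
        self.experts = nn.ModuleList(
            [
                nn.Sequential(
                    nn.Linear(d_model, d_hidden), nn.GELU(), nn.Linear(d_hidden, d_model)
                )
                for _ in range(num_experts)
            ]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model)
        gate_logits = self.router(x)                          # (tokens, num_experts)
        weights, indices = gate_logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)                  # renormalize over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e                  # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

# Example: 4 tokens of width 16 pass through a layer with 8 experts,
# but each token only activates 2 of them.
layer = TopKMoELayer(d_model=16, d_hidden=32)
tokens = torch.randn(4, 16)
print(layer(tokens).shape)  # torch.Size([4, 16])
```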