Moore Threads and "ShiZhe AI", an all-subject educational AI large model, recently announced the joint completion of a large-model training test. Running on Moore Threads' Kua'e (KUAE) thousand-card intelligent computing cluster, ShiZhe AI completed high-intensity training of a 7-billion-parameter large model in one week, achieving the expected training efficiency and demonstrating the capability of a domestic full-function GPU platform for thousand-card large-model training.

[Image: robot / artificial intelligence illustration. Image source note: the image was generated by AI; image licensing provided by Midjourney.]

"ShiZhe AI" was founded in 2020 by a core team from Tsinghua University and focuses on large models for all-subject education. Since opening its internal beta, it has accumulated more than 25,000 users, supports more than 30 subjects, and covers over 2,000 textbooks.

This training test verified the performance of Moore Threads' KUAE thousand-card intelligent computing cluster in large-model training, laying a foundation for future innovation in educational AI large models. The two parties will continue to collaborate on adapting large-model inference, optimizing the technology to meet high-frequency inference demands.

Liu Chunjiang, CEO of ShiZhe Large Model, said: "This training test demonstrates the powerful performance of the KUAE thousand-card intelligent computing cluster, and we have full confidence in domestic computing power. Going forward, ShiZhe Large Model will move more of its core business onto the KUAE cluster, providing users with efficient and stable computing services."