Recently, Elon Musk officially launched the new chatbot Grok 3 during a live stream and revealed the staggering cost of training the model. Grok 3 is currently available to Premium+ subscribers and has performed strongly across multiple benchmarks, even surpassing competing models such as Gemini, DeepSeek, and ChatGPT.
During the live stream, Musk said that training Grok 3 consumed roughly 200,000 Nvidia GPUs, a striking figure: Grok 2, by comparison, was trained on only about 20,000 GPUs, making Grok 3 a significant leap in computational scale. To support training at this scale, xAI built a supercomputing data center named "Colossus," considered one of the most powerful AI training facilities in the world.
With the launch of Grok 3, users can try the model family through the newly established website Grok.com. Musk stated that Grok 3 shows significant improvements in reasoning, comprehension, and content generation, which he expects will further drive the development of artificial intelligence technology.
Additionally, Musk said that xAI plans to expand its supercomputer's GPU cluster from the current 100,000 to 1,000,000, underscoring his ambitions in the AI field. This series of moves will undoubtedly have a profound impact on the future technology landscape.
In summary, the release of Grok 3 marks a notable advance in AI technology, and Musk's continued investment and innovation are likely to inject new momentum into the field.