Elon Musk recently announced that his artificial intelligence startup, xAI, will launch its large language model Grok-2 in August, signaling the arrival of more advanced AI capabilities. Even before Grok-2 has been fully unveiled, Musk is already enthusiastically hyping its successor, Grok-3.

When discussing AI development, Musk particularly emphasized the importance of datasets and the arduous task of cleaning data for large language models (LLMs). He also did not shy away from pointing out some challenges that OpenAI's models have faced in training output. He proudly disclosed that xAI's Grok-3 project is being trained on an astonishing 100,000 NVIDIA H100 high-performance AI chips. This number not only showcases xAI's deep strength in AI research and development but also suggests that Grok-3 will deliver what he calls an unprecedented, "extraordinary" experience.


The NVIDIA H100, a cutting-edge chip designed specifically for processing large language model (LLM) data, is priced at $30,000 to $40,000 per chip (approximately 219,000 to 292,000 RMB). The 100,000 chips invested by xAI therefore represent a sky-high outlay, estimated at between $3 billion and $4 billion (about 21.9 to 29.2 billion RMB at current exchange rates).
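A quick back-of-the-envelope check of those figures, sketched in Python below; the exchange rate of roughly 7.29 RMB per USD is an assumption, not a number from the article.

```python
# Rough cost estimate for the reported 100,000-chip H100 cluster.
# Assumption (not from the article): exchange rate of about 7.29 RMB per USD.

CHIPS = 100_000                                   # H100 chips reportedly used for Grok-3
PRICE_LOW_USD, PRICE_HIGH_USD = 30_000, 40_000    # quoted per-chip price range
RMB_PER_USD = 7.29                                # assumed exchange rate

low_usd = CHIPS * PRICE_LOW_USD                   # 3.0 billion USD
high_usd = CHIPS * PRICE_HIGH_USD                 # 4.0 billion USD

print(f"Total: ${low_usd / 1e9:.1f}B - ${high_usd / 1e9:.1f}B USD")
print(f"Total: {low_usd * RMB_PER_USD / 1e9:.1f}B - {high_usd * RMB_PER_USD / 1e9:.1f}B RMB")
```

Running this gives roughly $3.0B to $4.0B, or about 21.9 to 29.2 billion RMB under the assumed rate.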

It is worth noting that Musk previously revealed that Tesla's procurement budget for NVIDIA chips this year falls in a similar range, which inevitably invites speculation about whether xAI is leveraging Tesla's resources.

With the launch of Grok-2 approaching and preparations for Grok-3 in full swing, Musk's xAI is driving a new round of transformation in the field of artificial intelligence.