Elon Musk's xAI has introduced a new artificial intelligence training system named "Colossus," a launch the company is touting as a major breakthrough for the industry.

At its core are 100,000 Nvidia H100 GPUs, marking another significant step forward for xAI in the AI field.

Musk announced the news on his social platform X on Monday, claiming that Colossus is "the most powerful AI training system in the world." If accurate, that would put its performance ahead of the Department of Energy's Aurora supercomputer, which reached 10.6 exaflops in a benchmark test.


The strength of Colossus primarily stems from its GPUs. Since its release in 2022, Nvidia's H100 GPU has been regarded as the most advanced AI processor.

When running language models, the H100 is up to 30 times faster than its predecessor, especially on tasks built on the Transformer neural network architecture.

To further enhance Colossus's performance, Musk plans to increase the number of GPUs to 200,000 within the next few months, including 50,000 of the new generation H200 GPUs.

The H200 GPU offers significantly faster memory and data transfer than the H100, which should make training large AI models noticeably more efficient. For xAI, such an upgrade means it can develop more powerful language models.

In fact, xAI's flagship model, Grok-2, was trained on 15,000 GPUs; with Colossus's far greater computational capacity, developing its successors should go considerably faster. xAI plans to release the successor to Grok-2 by the end of the year.

It's worth noting that some of Colossus's server GPUs were originally reserved for Tesla. According to earlier reports, Musk had asked Nvidia to redirect 12,000 H100 GPUs, worth over $500 million, to xAI.

Even so, Tesla's spending on Nvidia hardware is expected to reach $3 billion to $4 billion by the end of the year, underscoring Musk's ambitions in both AI and electric vehicles.

Key Points:

🌟 xAI's "Colossus" system is equipped with 100,000 Nvidia H100 GPUs, claiming to be the world's most powerful AI training system.

🚀 Musk plans to double the number of GPUs in Colossus to 200,000 within a few months, including 50,000 new H200 GPUs.

💡 Colossus will provide powerful computational support for the successor to xAI's flagship large language model, Grok-2, which is expected to be released by the end of the year.