The sharp rise in demand for artificial intelligence computing has drawn widespread industry attention. Thomas Graham, co-founder of the optical computing startup Lightmatter, told Bloomberg in an interview that multiple AI data centers requiring massive amounts of power are expected to be built globally by 2026, with combined electricity consumption eight times that of New York City.
In the interview, Graham noted that technology companies such as Nvidia are continuously expanding large computing facilities worldwide to meet the demand for training large AI models like GPT-4. As more AI models move from research and development into production deployment, he argued, the need for large-scale computing will keep climbing, with demand for inference computing in particular growing at an exponential rate.
Graham also discussed Lightmatter's technology. The company develops optical computing chips that connect multiple processors within a single semiconductor package, replacing traditional network links with optical connections. This optical interconnect approach transmits data at higher speeds with lower energy consumption, making data center networks more efficient and economical.
He noted that at least twelve new AI data centers are currently under construction or planned, each requiring a gigawatt of power. New York City draws about five gigawatts of power on average, while future AI data centers worldwide are expected to need forty gigawatts, equivalent to eight New York Cities.
Lightmatter recently secured $400 million in venture capital, bringing the company's valuation to $4.4 billion. Graham said the company will enter the production phase in the coming years. He remains confident in the buildout of AI computing infrastructure, though he acknowledged that the emergence of new algorithms capable of performing AI computation more efficiently could affect how much the industry invests in computing power.
Key Points:
- ⚡ It is expected that by 2026, the global electricity demand for AI data centers will reach 40 gigawatts, equivalent to the power consumption of eight New York Cities.
- 💻 Optical computing startup Lightmatter is developing new optical chips to enhance the computing efficiency of data centers and reduce energy consumption.
- 📈 Several large AI data centers are currently under construction, demonstrating the urgent need for AI computing infrastructure.