As artificial intelligence continues to advance, balancing innovation with sustainable development has become a significant challenge. OpenAI recently launched o3, its most powerful AI model to date. Beyond the cost of running such models, however, their environmental impact has also drawn widespread attention.
One analysis estimates that each o3 task consumes approximately 1,785 kilowatt-hours of electricity, roughly the amount an average American household uses over two months. According to Boris Gamazaychikov, head of AI sustainability at Salesforce, that much electricity corresponds to about 684 kilograms of carbon dioxide equivalent emissions, comparable to burning five full tanks of gasoline.
The high-computation version of o3 was benchmarked under the ARC-AGI framework, with calculations based on the energy consumption of standard GPUs and grid emission factors. Gamazaychikov stated, “As technology continues to expand and integrate, we need to pay more attention to these trade-offs.” He also noted that this calculation did not account for embedded carbon, focusing solely on GPU energy consumption, meaning actual emissions may be underestimated.
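The conversion behind these figures can be checked with simple arithmetic. The sketch below is an illustration only: the emission factor is an assumed grid average inferred from the two reported numbers, not a value published by OpenAI or Salesforce.

```python
# Back-of-the-envelope check of the reported figures: energy per task
# multiplied by a grid emission factor gives emissions per task.
task_energy_kwh = 1785    # reported energy for one high-compute o3 task
emission_factor = 0.383   # kg CO2e per kWh (assumed grid average)

emissions_kg = task_energy_kwh * emission_factor
print(f"{emissions_kg:.0f} kg CO2e per task")  # ≈ 684 kg, matching the article
```

As Gamazaychikov notes, this kind of estimate covers only GPU energy use; embedded carbon from manufacturing the hardware would push the real total higher.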
Additionally, data scientist Kasper Groes Albin Ludvigsen pointed out that an HGX server equipped with eight Nvidia H100 GPUs draws between 11 and 12 kilowatts, far exceeding the 0.7 kilowatts per GPU often assumed, which would imply only about 5.6 kilowatts for the whole server.
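The gap Ludvigsen describes is easy to quantify. The sketch below uses only the figures quoted above; the roughly 2x ratio reflects everything beyond the GPUs themselves (CPUs, memory, networking, fans) that a per-GPU estimate misses.

```python
# GPU-only estimate vs. measured whole-server draw for an 8x H100 HGX box.
gpus_per_server = 8
per_gpu_kw = 0.7                            # commonly assumed per-GPU figure
gpu_only_kw = gpus_per_server * per_gpu_kw  # 5.6 kW if GPUs were the whole story

server_kw_low = 11                          # low end of Ludvigsen's estimate
ratio = server_kw_low / gpu_only_kw
print(f"GPU-only estimate: {gpu_only_kw:.1f} kW; "
      f"real server draw is at least {ratio:.1f}x higher")
```

Energy estimates built from per-GPU numbers alone therefore understate data-center consumption substantially.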
On how such tasks are defined, Pierre-Carl Langlais, co-founder of Pleias, raised concerns about the model's design, particularly if its computational demands cannot be scaled down quickly. "Solving complex mathematical problems requires a lot of drafts, intermediate tests, and reasoning," he said.
Earlier this year, research showed that a single ChatGPT conversation consumes nearly half a liter of water, about 10% of an average person's daily drinking-water intake. While that figure may seem small, total water consumption becomes substantial when millions of people use the chatbot every day.
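The scaling effect can be illustrated with one line of arithmetic. Only the roughly half-liter-per-conversation figure comes from the research above; the daily conversation count is a hypothetical usage level chosen for illustration.

```python
# Scaling per-conversation water use to a large user base (hypothetical volume).
liters_per_conversation = 0.5        # figure from the study cited above
daily_conversations = 10_000_000     # assumed daily conversation count

daily_liters = liters_per_conversation * daily_conversations
print(f"{daily_liters / 1000:,.0f} cubic meters of water per day")
```

At that assumed volume, the total comes to thousands of cubic meters of water every day.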
Kathy Baxter, chief architect of AI technology at Salesforce, warned that advancements in AI, like OpenAI's o3 model, may lead to the Jevons Paradox. “While the energy required may decrease, water usage may increase,” she said.
In response to the challenges AI data centers face, including high energy consumption, complex cooling needs, and massive physical infrastructure, companies like Synaptics and embedUR are turning to edge AI. The approach aims to reduce reliance on data centers, lower latency and energy consumption, and enable real-time decision-making at the device level.
Key Points:
🌍 Each o3 task's electricity consumption is equivalent to an average household's usage over two months.
⛽ The carbon emissions from each task are comparable to those from five full gasoline tanks.
💧 A single ChatGPT conversation consumes nearly half a liter of water, about 10% of an average person's daily drinking-water intake.