A recent study by the nonprofit research organization Epoch AI examined the energy consumption of OpenAI's chatbot platform, ChatGPT, and found it to be significantly lower than earlier estimates. A commonly cited figure held that ChatGPT uses about 3 watt-hours of electricity to answer a single question, but Epoch AI's analysis suggests that number is an overestimate. With OpenAI's current default model, GPT-4o, the average query consumes only about 0.3 watt-hours, less than many household appliances.
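To put the two per-query figures in perspective, a minimal sketch of the arithmetic: the query volume below is a hypothetical usage level chosen for illustration, not a number from the study.

```python
# Illustrative only: annual energy use under the earlier 3 Wh/query
# estimate versus Epoch AI's 0.3 Wh/query estimate for GPT-4o.
OLD_ESTIMATE_WH = 3.0   # earlier widely cited per-query figure
NEW_ESTIMATE_WH = 0.3   # Epoch AI's per-query estimate

QUERIES_PER_DAY = 100   # hypothetical usage level, not from the study

def annual_kwh(wh_per_query: float, queries_per_day: int) -> float:
    """Convert a per-query watt-hour estimate into kilowatt-hours per year."""
    return wh_per_query * queries_per_day * 365 / 1000

old_kwh = annual_kwh(OLD_ESTIMATE_WH, QUERIES_PER_DAY)
new_kwh = annual_kwh(NEW_ESTIMATE_WH, QUERIES_PER_DAY)
print(f"old estimate: {old_kwh:.1f} kWh/yr, new estimate: {new_kwh:.2f} kWh/yr")
```

At 100 queries a day, the revised estimate works out to roughly 11 kWh per year instead of about 110 kWh, a tenfold difference that tracks the gap between the two per-query figures.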
Joshua You, a data analyst at Epoch AI, said that common concerns about AI energy consumption do not accurately reflect the current situation. The earlier 3-watt-hour estimate rested on outdated research and assumptions, including the assumption that OpenAI was running less efficient chips. You added that while public concern about AI's future energy demand is reasonable, there is little clear data about its consumption today.
However, You acknowledged that Epoch's 0.3-watt-hour figure is still an approximation, since OpenAI has not publicly disclosed detailed energy consumption data. The analysis also excludes some additional features, such as image generation and input processing; queries with long inputs, such as those attaching a large number of files, could consume more energy.
Despite today's low figures, You expects energy consumption to rise. As AI technology advances, training these models will likely demand more energy, and future AI systems may take on more complex tasks that consume more electricity.
Meanwhile, AI infrastructure is expanding rapidly, which will drive substantial electricity demand. A Rand Corporation report, for example, projects that within the next two years AI data centers may require nearly as much power as California's entire 2022 electricity supply. By 2030, training a single cutting-edge model is projected to require power equivalent to the output of eight nuclear reactors.
OpenAI and its investment partners plan to spend billions of dollars on new AI data center projects over the next few years. As the technology develops, industry attention is also shifting toward reasoning models, which handle tasks more capably but require more computation, and therefore more electricity.
For those concerned about the energy footprint of their AI use, You suggests using these tools less frequently or choosing models with lower computational demands.
Key Points:
🌱 The average energy consumption of a ChatGPT query is about 0.3 watt-hours, significantly lower than the earlier estimate of 3 watt-hours.
🔌 The increase in AI energy consumption is mainly related to future technological advancements and the handling of more complex tasks.
🏭 OpenAI plans to invest heavily in expanding AI data centers in the coming years to meet the growing electricity demand.