A recent study from the University of Michigan describes an energy-efficient method for training large language models that achieves the same results in the same amount of time while cutting energy consumption by 30%. The researchers estimate the method could save enough energy to power 1.1 million American households by 2026.

The researchers developed a software tool named Perseus, which identifies the critical path, the chain of sub-tasks that takes the longest to complete. Perseus then slows the processors running sub-tasks off the critical path so that every task finishes at the same time as the critical path, eliminating unnecessary power consumption without delaying training.
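To make the idea concrete, here is a minimal sketch of critical-path-based slowdown planning. It is not Perseus's actual planner; the function name plan_slowdowns and the example durations are hypothetical, and the sketch assumes only the core principle reported above: devices whose sub-tasks finish early have slack, so they can run slower and still finish together with the critical path.

```python
def plan_slowdowns(task_durations):
    """Given per-device sub-task durations (seconds at full speed),
    return a speed factor for each device so that every device
    finishes together with the slowest one (the critical path),
    losing no time overall while drawing less power off the critical path."""
    critical = max(task_durations)  # critical-path length sets the iteration time
    # A device with slack can run at (duration / critical) of full speed
    # and still finish exactly when the critical path does.
    return [d / critical for d in task_durations]

# Hypothetical per-GPU sub-task times for one training iteration.
durations = [1.8, 2.4, 1.2, 2.4]  # seconds at full clock speed
factors = plan_slowdowns(durations)
for gpu, (d, f) in enumerate(zip(durations, factors)):
    print(f"GPU {gpu}: run at {f:.0%} of full speed "
          f"({d:.1f}s of work stretched to {d / f:.1f}s)")
```

In this toy example, the two GPUs on the critical path keep full speed, while the others are slowed so their work stretches to the same 2.4 seconds, which is where the energy savings come from.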


The team tested Perseus by training GPT-3, three other large language models, and a computer vision model. In these experiments, Perseus reduced the energy consumption of AI training while maintaining the same training speed.

The researchers emphasize that this kind of energy saving matters for equitable access to artificial intelligence. If a country lacks sufficient electricity to run large models, it may need to rely on remote services or operate smaller, less accurate models, a gap that could further exacerbate disparities between communities.

The study demonstrates that optimizing how AI training is scheduled can cut energy consumption without slowing training, a meaningful step for energy conservation and for reducing the carbon footprint of AI.