2024-11-08 11:01:50.AIbase.13.1k
Researchers Discover an Energy-Efficient Method for Training Large Language Models, Reducing Energy Consumption by 30%
Recently, a new study from the University of Michigan found that an energy-efficient method for training large language models can achieve the same results in the same amount of time while cutting energy consumption by 30%. By 2026, this method could save enough energy to power 1.1 million American households.

The researchers developed a software tool named Perseus that identifies the critical path, the series of subtasks that takes the longest to complete. Perseus then reduces the processor speed on non-critical paths so that all tasks finish at the same time, wasting no energy on work that would otherwise sit idle waiting for the critical path.
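The core idea can be illustrated with a small sketch. This is not Perseus's actual implementation (the tool's real logic and API are not described in the article); it is a hypothetical example showing why tasks off the critical path can run slower without delaying the overall job: the longest task sets the finish time, so every other parallel task can be clocked down in proportion to its slack.

```python
def slowdown_factors(durations):
    """Return a per-task speed factor so all parallel tasks finish together.

    A factor of 1.0 marks the critical path (full speed); smaller factors
    mean that task's processor can be slowed proportionally, saving energy
    while still finishing no later than the critical path does.
    """
    critical = max(durations)  # the longest task defines total runtime
    return [d / critical for d in durations]

# Three parallel subtasks taking 8, 10, and 5 time units:
# the 10-unit task is critical; the others have slack.
factors = slowdown_factors([8.0, 10.0, 5.0])
print(factors)  # [0.8, 1.0, 0.5]
```

In practice the savings come from dynamic voltage and frequency scaling, where running a processor slower draws disproportionately less power, so shaving speed off non-critical work reduces energy without extending the job.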