AWS India launches 'AI Ready' program to provide free AI training to 2 million people by 2025

In the fiercely competitive AI landscape, a million-dollar experiment is quietly reshaping how large language models (LLMs) are trained. Jieyue Xingchen's research team recently released groundbreaking findings: using nearly 1 million NVIDIA H800 GPU hours, they trained 3,700 models of varying sizes from scratch, processing a staggering 100 trillion tokens in total. The effort revealed a universal scaling law, dubbed 'Step Law,' that offers a new guide for efficient LLM training.
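The report does not give Step Law's exact functional form, but scaling laws of this kind are typically power laws fit across many training runs. As a minimal illustrative sketch (not the team's actual method), a power law y = a·x^b can be recovered from measured points by linear regression in log space; `fit_power_law` and the sample data below are hypothetical:

```python
import numpy as np

def fit_power_law(x, y):
    """Fit y = a * x**b by least squares on log-transformed data.

    Taking logs turns the power law into a line:
    log y = log a + b * log x. Returns (a, b).
    """
    b, log_a = np.polyfit(np.log(x), np.log(y), 1)  # slope, intercept
    return np.exp(log_a), b

# Fabricated example: pretend each point pairs a model size with a
# measured optimal value (e.g. learning rate). Exponent is made up.
sizes = np.array([1e7, 1e8, 1e9, 1e10])
values = 0.5 * sizes ** -0.2
a, b = fit_power_law(sizes, values)
print(round(a, 3), round(b, 3))  # recovers the generating coefficients
```

In practice a study like this one would fit such curves to thousands of runs and validate the law on model sizes outside the fitted range.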
According to an exclusive report from Blue Whale News, AI video startup Luying Technology (Avolution.ai) is reportedly being acquired by the prominent AI company MiniMax. Multiple sources say the two parties have reached a preliminary agreement on the acquisition and that the relevant procedures are underway; as of press time, MiniMax had not responded to the news. Luying Technology's valuation in its 2024 angel round was reportedly around 100 million RMB (under 20 million USD), and sources say the company had been seeking a second round of funding since last year.
AI startup Luma recently announced on X that it has open-sourced a new image-model pretraining technique called Inductive Moment Matching (IMM). The technique has drawn significant attention for its efficiency and stability and is considered a notable advance in generative AI. According to X user linqi_zhou, IMM is a novel generative paradigm that can be trained stably from scratch with a single model and a single objective, surpassing existing methods in both sampling efficiency and sample quality.