As the developer of the Llama family of foundation language models, Meta notes that the computing power required for future model training will increase sharply, with the compute needed to train Llama 4 expected to be roughly 10 times that used for Llama 3. In light of this trend, Meta stresses the importance of building up training capacity in advance to avoid falling behind in the competition. Meta has released Llama 3, whose largest variant has 70 billion parameters, and its upgraded version Llama 3.1, which scales to 405 billion parameters, its largest open-source model to date. To meet future model demands, Meta is expanding its computing infrastructure ahead of need.