Baichuan Intelligence recently announced the open-source release of two large language models, Baichuan2-7B and Baichuan2-13B, which deliver superior performance on multiple Chinese and English general benchmarks, significantly outperforming Meta's Llama 2. Baichuan2 was trained on 2.6 trillion tokens spanning multiple domains, with a training process that was efficient, stable, and predictable. In addition, Baichuan Intelligence has open-sourced 11 intermediate checkpoints, which are of significant value for ongoing research on model training. Notably, the Baichuan2 series models are available free of charge for commercial use, making them an attractive choice for domestic enterprises.