YuLan-Mini is a lightweight language model developed by the AI Box team at Renmin University of China. With 2.42 billion parameters, it achieves performance comparable to industry-leading models trained on larger datasets, despite being pre-trained on only 1.08 trillion tokens. The model excels in the mathematics and coding domains, and to facilitate reproducibility, the team will open-source the relevant pre-training resources.