YuLan-Mini

A highly efficient lightweight language model with 2.4 billion parameters.

YuLan-Mini is a lightweight language model developed by the AI Box team at Renmin University of China. With 2.4 billion parameters, it achieves performance comparable to industry-leading models trained on far larger datasets, despite being pre-trained on only 1.08 trillion tokens. The model is particularly strong in mathematics and coding, and to facilitate reproducibility the team will open-source the relevant pre-training resources.
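A minimal sketch of how such a model could be loaded and queried with the Hugging Face Transformers library. The repository id `yulan-team/YuLan-Mini` is an assumption and may differ from the actual release; the `transformers` import is deferred into the function so the module can be inspected without the dependency installed.

```python
MODEL_ID = "yulan-team/YuLan-Mini"  # assumed Hugging Face repo id; verify before use


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a greedy-decoded completion from the base model.

    Imports transformers lazily so that merely loading this module does
    not require the (heavy) dependency or a model download.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Downloads the model weights on first run.
    print(generate("Write a Python function that reverses a string."))
```

Since the model is a base (non-chat) checkpoint, plain text completion as above is the natural interface; instruction-style prompting would require a fine-tuned variant.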

YuLan-Mini Visit Over Time

Monthly Visits: 494,758,773
Bounce Rate: 37.69%
Pages per Visit: 5.7
Visit Duration: 00:06:29
