MaLA-500
A large language model covering 534 languages
Categories: Product, Others, Language Model, Natural Language Processing
MaLA-500 is a large language model that aims to cover 534 languages. It is built from LLaMA 2 through vocabulary extension and continued pre-training on the Glot500-c corpus. In experiments on the SIB-200 benchmark, MaLA-500 achieves state-of-the-art in-context learning results, positioning it to improve natural language processing for low-resource languages.
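To make the vocabulary-extension step above concrete, here is a minimal sketch of what happens to a model's embedding matrix when new tokens are added for additional languages: the matrix gains one row per new token, and the new rows need an initialization. The mean-of-existing-embeddings heuristic and the noise scale used below are illustrative assumptions, not details taken from the MaLA-500 paper.

```python
import numpy as np

def extend_embeddings(emb: np.ndarray, n_new: int, seed: int = 0) -> np.ndarray:
    """Append n_new rows to an embedding matrix for newly added vocabulary.

    New rows are initialized to the mean of the existing embeddings plus
    small noise -- a common heuristic for vocabulary extension; the exact
    initialization used for MaLA-500 may differ.
    """
    rng = np.random.default_rng(seed)
    mean = emb.mean(axis=0, keepdims=True)           # (1, dim)
    noise = 0.01 * rng.standard_normal((n_new, emb.shape[1]))
    return np.vstack([emb, mean + noise])            # old rows are unchanged

# Toy example: a 10-token vocabulary with 4-dim embeddings gains 3 tokens.
old = np.random.default_rng(1).standard_normal((10, 4))
new = extend_embeddings(old, 3)
print(new.shape)  # (13, 4)
```

After extending the matrix, continued pre-training (as described above) lets the model learn useful representations for the new tokens while the original rows start from their LLaMA 2 values.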
MaLA-500 Visit Over Time
- Monthly Visits: 19,075,321
- Bounce Rate: 45.07%
- Pages per Visit: 5.5
- Avg. Visit Duration: 00:05:32