MAP-NEO
MAP-NEO is an entirely open-source large language model offering advanced natural language processing capabilities.
MAP-NEO is an open-source large language model released together with its pre-training data, data processing pipeline (Matrix), pre-training scripts, and alignment code. The model was trained from scratch on 4.5T English and Chinese tokens and demonstrates performance comparable to LLaMA2 7B. MAP-NEO excels at challenging tasks such as reasoning, mathematics, and coding, outperforming models of similar scale. In pursuit of transparency in the LLM training process for research purposes, we have fully released MAP-NEO, including final and intermediate checkpoints, a self-trained tokenizer, the pre-training corpus, and an efficient, stable, optimized pre-training codebase.
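For readers who want to try the released checkpoints, a minimal sketch of loading the model with Hugging Face Transformers might look like the following. The repository id m-a-p/neo_7b, the prompt, and the generation settings are assumptions for illustration, not details confirmed by this page.

```python
# Minimal sketch: loading a MAP-NEO checkpoint with Hugging Face Transformers.
# The repository id "m-a-p/neo_7b" is an assumption; substitute the actual
# checkpoint name published with the release.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "m-a-p/neo_7b"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Run a simple generation to sanity-check the checkpoint.
prompt = "Explain the quicksort algorithm in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```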
MAP-NEO Visit Over Time
Monthly Visits: 515,580,771
Bounce Rate: 37.20%
Pages per Visit: 5.8
Visit Duration: 00:06:42