Shanghai Artificial Intelligence Laboratory introduced InternLM2.5, the new generation of the InternLM series models, at the WAIC Scientific Frontier Main Forum on July 4, 2024. This version delivers comprehensive improvements in reasoning under complex scenarios, supports a 1M-token ultra-long context, and can autonomously search the internet and integrate information from hundreds of web pages.

InternLM2.5 is released in three parameter sizes, 1.8B, 7B, and 20B, to cater to different application scenarios and developer needs. The 1.8B version is an ultra-lightweight model, while the 20B version offers stronger overall performance and supports more complex practical scenarios. All of these models are open source and available on the InternLM series large model homepage, the ModelScope homepage, and the Hugging Face homepage; a minimal loading sketch is shown below.
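The open-weight checkpoints can be loaded with standard Hugging Face Transformers tooling. The sketch below is illustrative rather than official documentation: the model ID internlm/internlm2_5-7b-chat and the chat-template usage are assumptions based on common Transformers conventions, and InternLM checkpoints require trust_remote_code to load their custom modeling code.

```python
# Minimal sketch: load the assumed internlm/internlm2_5-7b-chat checkpoint and chat with it.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "internlm/internlm2_5-7b-chat"  # assumed Hugging Face model ID

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # half precision so the 7B model fits on a single GPU
    trust_remote_code=True,      # InternLM ships custom modeling code with the weights
    device_map="auto",
)

# Build a chat prompt with the tokenizer's chat template and generate a reply.
messages = [{"role": "user", "content": "Briefly introduce InternLM2.5."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```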

InternLM2.5 iterates on multiple data synthesis techniques, significantly strengthening the model's reasoning capabilities and reaching 64.7% accuracy on the MATH evaluation set. In addition, efficient training during the pre-training phase has improved the model's ability to handle long contexts.

The InternLM2.5 series also integrates seamlessly with downstream inference and fine-tuning frameworks, including the XTuner fine-tuning framework and the LMDeploy inference framework developed by Shanghai Artificial Intelligence Laboratory, as well as community frameworks with broad user bases such as vLLM, Ollama, and llama.cpp. The SWIFT tool from the ModelScope community also supports inference, fine-tuning, and deployment of the InternLM2.5 series models; a deployment sketch follows.
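As one example of this framework integration, the sketch below serves the model with LMDeploy's high-level pipeline interface. It is a minimal illustration under the same assumed model ID as above, not the only or official deployment path; vLLM, Ollama, or llama.cpp could be used analogously.

```python
# Minimal sketch: local inference with the LMDeploy pipeline API
# (assumes `pip install lmdeploy` and the internlm/internlm2_5-7b-chat model ID).
from lmdeploy import pipeline

# Build an inference pipeline; LMDeploy downloads and serves the checkpoint locally.
pipe = pipeline("internlm/internlm2_5-7b-chat")

# Run a batch of prompts and print the generated responses.
responses = pipe(["Hi, please introduce InternLM2.5.",
                  "Summarize the benefits of a 1M-token context window."])
for r in responses:
    print(r.text)
```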

These models support application experiences such as multi-step complex reasoning, precise understanding of multi-turn dialogue intent, flexible format control, and complex instruction following. Detailed installation and usage guides are provided to help developers get started quickly.

InternLM Series Large Model Homepage:

https://internlm.intern-ai.org.cn

ModelScope Homepage:

https://www.modelscope.cn/organization/Shanghai_AI_Laboratory?tab=model

Hugging Face Homepage:

https://huggingface.co/internlm

InternLM2.5 Open Source Link:

https://github.com/InternLM/InternLM