IBM has announced version 3.1 of its open-source Granite language models, a release that brings a number of significant improvements. The new version features a redesigned, denser architecture with a context window of up to 128,000 tokens, substantially strengthening Granite's ability to handle long documents and complex tasks.

The Granite 3.1 models were trained on roughly 12 trillion tokens spanning 12 natural languages and 116 programming languages, further improving their language understanding and generation. IBM says the new models excel at tasks such as answering questions with external data (retrieval-augmented generation, or RAG), extracting information from unstructured text, and summarizing documents.

Developers can now access these models on the Hugging Face platform, giving them a solid starting point for a wide range of applications; a minimal usage sketch follows below. Granite was first launched in May 2024, and this update marks IBM's continued progress in the open-source AI field.
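As a rough sketch of what getting started might look like, the snippet below loads a Granite 3.1 instruct model with the Hugging Face `transformers` library and runs a short RAG-style prompt. The model ID `ibm-granite/granite-3.1-8b-instruct`, the prompt format, and the generation settings are assumptions based on the collection linked below, not details taken from this article.

```python
# Minimal sketch: load a Granite 3.1 instruct model from Hugging Face and run a
# short RAG-style prompt. The model ID is an assumption based on the linked
# collection; swap in whichever checkpoint you actually want to use.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-3.1-8b-instruct"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to reduce memory use
    device_map="auto",           # place weights on available GPU(s)/CPU
)

# Simple RAG-style prompt: retrieved context is pasted ahead of the question.
context = "Granite 3.1 supports a context window of up to 128,000 tokens."
question = "How many tokens can Granite 3.1 handle at once?"

messages = [
    {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=100)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```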

The Granite update is more than a technical enhancement: it gives developers and enterprises more flexible and capable tools for processing and analyzing data. As AI technology continues to evolve, the Granite models are positioned to play a meaningful role in digital transformation across industries.

With these improvements, IBM hopes to draw more developers into its open-source community and jointly advance AI technology. The release of Granite 3.1 is both a technical step forward and a push for future research on language models.

Project link: https://huggingface.co/collections/ibm-granite/granite-31-language-models-6751dbbf2f3389bec5c6f02d

Highlights:

🌟 The redesigned Granite 3.1 models support context windows of up to 128,000 tokens.

🌍 The models were trained on 12 trillion tokens covering 12 natural languages and 116 programming languages.

💻 Developers can access these powerful open-source language models through the Hugging Face platform.