In the AI field, a new force is emerging. Abu Dhabi's Technology Innovation Institute (TII) has announced the open-source release of its new large model, Falcon2. With 11 billion parameters, the model has drawn global attention for its strong performance and multilingual capabilities.

Falcon2 comes in two versions: a base model that is easy to deploy and can generate text, code, and summaries; and a version with vision-language model (VLM) capability, still rare among open-source large models, which can convert image information into text. In multiple authoritative benchmark tests, the Falcon2 11B model outperforms Meta's Llama 3 8B and ties for first place with Google's Gemma 7B, demonstrating its exceptional performance.


Image Source Note: Image generated by AI, authorized by Midjourney

Falcon2 11B's multilingual capabilities let it handle tasks in English, French, Spanish, German, Portuguese, and other languages with ease, broadening its applicability across scenarios. As a vision-language model, Falcon2 11B VLM can recognize and interpret images and visual content in the environment, giving it broad application potential in industries such as healthcare, finance, e-commerce, education, and law.

Falcon2 11B was pre-trained on more than 5.5 trillion tokens from the open-source RefinedWeb dataset, a high-quality corpus that has been filtered and deduplicated. TII supplemented it with curated corpora and adopted a four-stage training strategy to improve the model's contextual understanding.


It is worth noting that Falcon2 is a powerful yet resource-efficient model that can run on a single GPU. This makes it highly scalable, easy to deploy, and even feasible to integrate into lightweight devices such as laptops, a great convenience for small and medium-sized enterprises and individual developers; it is also available for commercial use.
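To see why a single GPU can suffice, here is a back-of-envelope estimate of the memory needed just to store the weights at different numeric precisions. The arithmetic is my own illustration, not figures published by TII:

```python
# Rough estimate (not TII's figures): memory required just to hold the
# model weights; activations and KV cache add more on top of this.
def weight_memory_gb(n_params: float, bytes_per_param: int) -> float:
    """Gigabytes (10^9 bytes) needed to store the weights alone."""
    return n_params * bytes_per_param / 1e9

PARAMS = 11e9  # Falcon2 11B

fp32 = weight_memory_gb(PARAMS, 4)  # 32-bit floats: 44 GB, too big for most single cards
bf16 = weight_memory_gb(PARAMS, 2)  # bfloat16: 22 GB, fits a 24 GB-class GPU
int8 = weight_memory_gb(PARAMS, 1)  # 8-bit quantization: 11 GB, fits many consumer GPUs

print(f"fp32: {fp32:.0f} GB, bf16: {bf16:.0f} GB, int8: {int8:.0f} GB")
```

At half precision the weights fit on a single 24 GB GPU, which is consistent with the article's single-GPU claim.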

Dr. Hakim Hacid, Executive Director and Acting Chief Researcher of TII's AI Cross-Center Unit, stated that as generative AI evolves, developers are recognizing the advantages of smaller models: lower computing requirements, compliance with sustainability standards, and greater flexibility.

As early as May 2023, TII open-sourced the Falcon-40B model, which ranked first on Hugging Face's open LLM leaderboard, surpassing a series of well-known open-source models. Falcon-40B was trained on a dataset of 1 trillion tokens and is well suited to question answering, summarization, automatic code generation, and language translation, with support for fine-tuning to specific business scenarios.

Founded in 2020, TII is a research institution under Abu Dhabi's higher education and scientific research authority, with the aim of advancing scientific research, developing cutting-edge technologies, and commercializing them to drive the economic development of Abu Dhabi and the UAE. TII currently employs more than 800 research experts from 74 countries, has published over 700 papers, and holds 25 patents, making it one of the world's leading research institutions.

The open-source release of Falcon2 reflects not only TII's commitment to technology sharing but also a bold exploration of the future of AI development. With its lower computing requirements, sustainability, and flexibility, the open-source Falcon2 fits squarely into the emerging trend of edge AI infrastructure.

Model Address: https://huggingface.co/tiiuae/falcon-11B
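For readers who want to try the model, a minimal loading sketch using the standard Hugging Face transformers API follows. The generation settings are illustrative choices of mine, not official TII recommendations, and running it requires a GPU with roughly 24 GB of memory for bfloat16 weights:

```python
# Minimal sketch of loading Falcon2 11B with Hugging Face transformers.
# The model id is taken from the address above; everything else is an
# illustrative default, not a TII recommendation.
MODEL_ID = "tiiuae/falcon-11B"

def generate(prompt: str, max_new_tokens: int = 100) -> str:
    """Download Falcon2 11B (on first call) and complete the prompt greedily."""
    # Heavy imports kept inside the function so merely importing this
    # module does not require torch/transformers to be installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # 2 bytes per weight: ~22 GB for 11B params
        device_map="auto",           # place weights on the available GPU(s)
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Note that the first call to `generate(...)` downloads the full model weights (on the order of 22 GB), so it is best run on a machine with a fast connection and ample disk space.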