TII, the Abu Dhabi-based Technology Innovation Institute, has released a top-tier open-source large language model: Falcon 180B. Trained on 3.5 trillion tokens, the model has 180 billion parameters, surpassing earlier open-source models such as Llama 2 and rivaling Google's PaLM 2, with performance approaching OpenAI's GPT-4 on some benchmarks. Falcon 180B excels across a range of language tasks and is regarded as one of the strongest open-source large models currently available. It is free for commercial use (subject to its license terms) and ships with a chat-tuned version that anyone can try. The training data comes primarily from the RefinedWeb dataset, supplemented with curated sources such as dialogues, technical papers, and code.
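Since the weights are published on the Hugging Face Hub, the chat-tuned version can be queried locally with the `transformers` library. The following is a minimal sketch, assuming access to the `tiiuae/falcon-180B-chat` checkpoint and enough GPU memory to shard the model (the full 180B model needs roughly 400 GB in bfloat16, so multi-GPU hardware or quantization is required in practice); the prompt format is an illustrative assumption, not an official template.

```python
# Sketch: generating text with Falcon 180B Chat via Hugging Face transformers.
# Assumes the gated "tiiuae/falcon-180B-chat" weights have been downloaded
# and that there is enough GPU memory to hold the sharded model.
import torch
from transformers import AutoTokenizer, pipeline

model_id = "tiiuae/falcon-180B-chat"

tokenizer = AutoTokenizer.from_pretrained(model_id)
generator = pipeline(
    "text-generation",
    model=model_id,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,  # halves memory versus float32
    device_map="auto",           # shard layers across available GPUs
)

# Hypothetical chat-style prompt for illustration.
prompt = "User: Summarize what Falcon 180B is in one sentence.\nFalcon:"
outputs = generator(prompt, max_new_tokens=100, do_sample=True, top_k=10)
print(outputs[0]["generated_text"])
```

For quick experimentation without this hardware, the hosted demo on Hugging Face offers the same conversational model through a web interface.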