Tongyi Qianwen's Qwen-72B Ranks First Among Pre-trained Models on Hugging Face's Open-Source Large Model Leaderboard
Source: ChinaZ.com (站长之家)
The Qwen-72B model from Tongyi Qianwen has claimed the top spot on Hugging Face's open-source large model leaderboard, with 72 billion parameters and an overall score of 73.6. The leaderboard evaluates the world's leading open-source large models across six dimensions, including reading comprehension, logical reasoning, mathematical computation, and factual question answering, and Qwen-72B ranked first among pre-trained models. Tongyi Qianwen, launched by Alibaba Cloud, is an ultra-large-scale language model that supports multi-turn dialogue, copywriting, logical reasoning, multimodal understanding, and multiple languages.
© Copyright AIbase 2024. Source: https://www.aibase.com/news/4025