AIbase · 2024-09-13 11:08:52
Yuanxiang Releases Open-Source MoE Large Model XVERSE-MoE-A36B with 36 Billion Active Parameters
Shenzhen Yuanxiang Information Technology Co., Ltd. recently announced the release of China's largest Mixture of Experts (MoE) open-source large model, XVERSE-MoE-A36B. The launch marks a significant advance for China's AI field, bringing domestic open-source technology to an internationally leading level.
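Since the model is released as open source, it can presumably be loaded with the Hugging Face transformers library like other publicly hosted checkpoints. The sketch below is a minimal illustration only: the repository id "xverse/XVERSE-MoE-A36B" and the use of the standard causal-LM API are assumptions, not details confirmed in this announcement.

```python
# Minimal sketch: loading an open-source MoE checkpoint with Hugging Face transformers.
# The repo id "xverse/XVERSE-MoE-A36B" is assumed, not confirmed by this article.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "xverse/XVERSE-MoE-A36B"  # assumed Hugging Face repository id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",      # let transformers pick the dtype stored in the checkpoint
    device_map="auto",       # shard the large checkpoint across available GPUs (requires accelerate)
    trust_remote_code=True,  # MoE models often ship custom modeling code with the repo
)

# Simple generation call to verify the model loads and responds.
inputs = tokenizer("The Mixture of Experts architecture works by", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```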