2024-09-13 · AIbase
Yuanxiang Releases Open Source MoE Large Model XVERSE-MoE-A36B with 36 Billion Activated Parameters
Shenzhen Yuanxiang Information Technology Co., Ltd. recently announced the release of XVERSE-MoE-A36B, which it describes as China's largest open source Mixture of Experts (MoE) large model. The "A36B" in the name refers to the number of activated parameters: for any given token, only about 36 billion of the model's total parameters participate in computation. The launch marks a significant advance for China's AI field, bringing domestic open source technology to an internationally leading level.
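To make the "activated parameters" idea concrete, below is a minimal sketch of top-k MoE routing in PyTorch. This is illustrative only and not XVERSE's published implementation; the class name, dimensions, and expert count are all assumptions chosen for the toy example. It shows why an MoE model can have far more total parameters than activated parameters: a gating network routes each token to only k experts, so only those experts' weights run per token.

```python
# Illustrative top-k MoE layer (assumed design, not XVERSE's actual code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, d_model: int, d_ff: int, num_experts: int, k: int):
        super().__init__()
        self.k = k
        # Gating network scores each expert for each token.
        self.gate = nn.Linear(d_model, num_experts, bias=False)
        # Each expert is a small feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model)
        scores = self.gate(x)                          # (tokens, num_experts)
        topk_scores, topk_idx = scores.topk(self.k, dim=-1)
        weights = F.softmax(topk_scores, dim=-1)       # normalize over chosen experts
        out = torch.zeros_like(x)
        # Only the k selected experts run per token: these are the
        # "activated" parameters; all other experts are skipped.
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = topk_idx[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

# Toy usage: 8 experts with 2 active per token, so roughly a quarter of
# the expert parameters are used for any single token.
layer = TopKMoE(d_model=64, d_ff=256, num_experts=8, k=2)
y = layer(torch.randn(10, 64))
print(y.shape)  # torch.Size([10, 64])
```

In a model like XVERSE-MoE-A36B this routing is what lets the total parameter count greatly exceed the 36 billion parameters activated per token, keeping inference cost closer to that of a much smaller dense model.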