2023-12-19 08:34:36 · AIbase
Yuanxiang Open-Sources High-Performance Large Model XVERSE-65B-2 Base, Enhancing Code and Math Capabilities
The base version of Yuanxiang's open-source high-performance large model XVERSE-65B-2 focuses on strengthening code and math capabilities. XVERSE-65B-2 was further optimized through continual pre-training, bringing the total training volume to 3.2 trillion tokens. In comprehensive evaluations, Yuanxiang's model surpasses GPT-3.5 and Llama2-70B and leads open-source models on these benchmarks. XVERSE-65B also performs strongly on SuperCLUE's Chinese general large model tasks.
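Since XVERSE-65B-2 is released as an open-source base (non-chat) model, a minimal inference sketch with Hugging Face transformers might look like the following. The repo id xverse/XVERSE-65B-2 and the trust_remote_code requirement are assumptions, not details confirmed by this announcement.

```python
# A minimal sketch of loading the open-source base model for inference.
# Assumption: the checkpoint is published on Hugging Face under the
# repo id "xverse/XVERSE-65B-2" (not confirmed by the article).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "xverse/XVERSE-65B-2"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # a 65B model needs multiple GPUs; bf16 halves memory
    device_map="auto",           # shard layers across all available devices
    trust_remote_code=True,
)

# Base model, so plain text completion rather than a chat template;
# a code prompt exercises the enhanced code capability.
inputs = tokenizer("def fibonacci(n):", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```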