At the Fu Sheng 2024 Opening AI Grand Lecture and the Orion Star Large Model Launch Event, Fu Sheng unveiled the Orion Star Large Model (Orion-14B). The model has 14 billion parameters, covers common languages and professional terminology, and has achieved top results on multiple third-party benchmarks. It supports ultra-long inputs of up to 320K tokens, with an inference speed of 31 tokens per second, and is particularly strong in multilingual tasks, notably Japanese and Korean. To meet enterprise application needs, Orion Star also introduced fine-tuned variants of the model and several applications aimed at helping enterprises improve operational efficiency and decision-making.