Recently, Caiyun Technology held a communication event themed "From Paper to App" in Beijing, officially launching "Yunjin Tianzhang," a general-purpose large model built on the DCFormer architecture, and announcing that its AI RPG platform Caiyun Xiaomeng has been upgraded to version V3.5 on the same architecture. The launch marks a notable step forward in model-architecture efficiency in the field of artificial intelligence.

In the AI sector, the Transformer architecture has been the core technical foundation of mainstream large models such as ChatGPT and Gemini. This year, Caiyun Technology published a paper titled "Improving Transformers with Dynamically Composable Multi-Head Attention" at ICML, the top-tier international machine learning conference, introducing the DCFormer architecture for the first time. Tests show that DCPythia-6.9B, a model built on this architecture, achieved a 1.7- to 2-fold efficiency improvement over conventional Transformer models, matching the performance of Transformers trained with 1.7 to 2 times the compute.

On the energy challenges facing AI development, Caiyun Technology CEO Yuan Xingyuan noted that, by some projections, global AI power consumption could reach 8 times the Earth's current electricity generation capacity by 2050. NVIDIA CEO Jensen Huang put it more vividly: at the current pace of development, AI may eventually need "14 planets, 3 galaxies, and 4 suns" to meet its energy needs.

To address this dilemma, Caiyun Technology chose to improve the model's underlying architecture. DCFormer introduces a dynamically composable multi-head attention (DCMHA) mechanism, which releases attention heads from their fixed bindings in the traditional multi-head attention (MHA) module and lets them recombine dynamically based on the input, significantly enhancing the model's expressive capability. This innovation earned Caiyun Technology's three ICML papers an average review score of 7 and made it one of only two Chinese companies invited to present on stage at ICML 2024 in Vienna.
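To make the head-composition idea concrete, here is a minimal PyTorch sketch. It is an illustration, not Caiyun Technology's released implementation: the class name SimplifiedDCMHA, the tensor shapes, and the single query-conditioned mixing matrix are assumptions made for clarity, and the Compose operation described in the paper is richer (it transforms attention matrices both before and after the softmax, with low-rank dynamic branches).

```python
import torch
import torch.nn as nn

class SimplifiedDCMHA(nn.Module):
    """Illustrative sketch of dynamically composable multi-head attention.
    Hypothetical simplification; not the paper's actual Compose operation."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)
        # Query-conditioned head-mixing weights: instead of each head using
        # its own attention map in isolation (the fixed binding of standard
        # MHA), the H maps are recombined per query position.
        self.mix = nn.Linear(d_model, n_heads * n_heads)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, D = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Reshape to (B, H, T, d_head) for per-head attention.
        q = q.view(B, T, self.n_heads, self.d_head).transpose(1, 2)
        k = k.view(B, T, self.n_heads, self.d_head).transpose(1, 2)
        v = v.view(B, T, self.n_heads, self.d_head).transpose(1, 2)
        attn = torch.softmax(q @ k.transpose(-2, -1) / self.d_head ** 0.5, dim=-1)
        # Dynamic composition: an input-dependent (H x H) matrix per query
        # position recombines the heads' attention maps across heads.
        w = self.mix(x).view(B, T, self.n_heads, self.n_heads)
        mixed = torch.einsum("bthg,bgts->bhts", w, attn)  # (B, H, T, T)
        out = (mixed @ v).transpose(1, 2).reshape(B, T, D)
        return self.out(out)

# Tiny smoke test with random inputs.
layer = SimplifiedDCMHA(d_model=64, n_heads=4)
print(layer(torch.randn(2, 16, 64)).shape)  # torch.Size([2, 16, 64])
```

The departure from standard MHA is the einsum line: each query position gets its own H×H matrix that recombines the heads' attention maps, so a head's search pattern (its QK circuit) is no longer permanently bound to a single output transformation (its OV circuit).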

As the first product to put the DCFormer architecture into production, the new version of Caiyun Xiaomeng delivers markedly better performance: it supports text inputs of up to 10,000 characters, story background settings of up to 10,000 characters, and a 20% improvement in overall fluency and coherence. In practice, this means the AI is better at keeping the narrative consistent, holding characters to their established personalities, and reflecting on and revising the plot.

As one of the earliest Chinese companies to work on large language models, Caiyun Technology currently operates three profitable AI products: Caiyun Weather, Caiyun Xiaomeng, and Caiyun Xiaoyi. The company said it will keep increasing its investment in DCFormer research and development, aiming to break the long-standing division of labor in which foreign firms build the technology layer while domestic firms build only the application layer, and to help domestic AI technology secure an advantageous position in global competition.

With this technological breakthrough, Caiyun Technology not only demonstrates the strength of Chinese companies in foundational AI architecture innovation, but also offers a new approach to the energy bottleneck in AI development, potentially accelerating the sustainable growth of AI technology.