The Southern University of Science and Technology, in collaboration with the IDEA Institute's CCNL Center, has released SUS-Chat-34B, a bilingual Chinese-English model with 34 billion parameters that outperforms other models of comparable size on both Chinese and English tasks. SUS-Chat-34B is fine-tuned from the Yi-34B pre-trained model on millions of high-quality multilingual instruction examples. Training on a large volume of complex instruction-following data gives the model robust performance on general tasks, an extended context window, and strong multi-turn dialogue capabilities.
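
For readers who want to try the model, below is a minimal sketch of loading it with the Hugging Face `transformers` library and running a single chat turn. The repository ID `SUSTech/SUS-Chat-34B` and the `### Human / ### Assistant` prompt format are assumptions based on the publicly available model card and should be checked against the official release before use.

```python
# Minimal sketch: load SUS-Chat-34B and generate one reply.
# Assumptions: the model is hosted as "SUSTech/SUS-Chat-34B" on Hugging Face,
# and it expects the "### Human / ### Assistant" prompt template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "SUSTech/SUS-Chat-34B"  # assumed repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to reduce memory for a 34B model
    device_map="auto",           # spread layers across available GPUs/CPU
)

# Assumed chat template; verify against the official documentation.
prompt = "### Human: What is the capital of France?\n\n### Assistant: "
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)

# Strip the prompt tokens and print only the newly generated reply.
reply = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(reply)
```

For multi-turn dialogue, the same pattern applies: append each user message and model reply to the prompt in the assumed template before generating the next turn.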