Meta has released Llama2-Long, a model that handles long texts without increasing computational demands while maintaining strong performance. Through continual pre-training, an improved positional encoding, and data-mixing strategies, it reduces computational overhead by up to 40%. The model performs well on both short- and long-context tasks, even surpassing GPT-3.5, and its release brings fresh momentum to the field of natural language processing.
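
The "improved positional encoding" mentioned above generally refers to adjusting the base frequency of rotary position embeddings (RoPE) so that attention can still distinguish far-apart tokens at long context lengths. The NumPy sketch below illustrates that idea only; the function names, tensor shapes, and the base values 10,000 and 500,000 are illustrative assumptions, not Meta's published configuration.

```python
# A minimal sketch of rotary position embeddings (RoPE) with an adjustable
# base frequency. Values and names here are illustrative assumptions.
import numpy as np

def rope_angles(seq_len: int, head_dim: int, base: float = 10_000.0) -> np.ndarray:
    """Return the rotation angles theta[pos, i] used by RoPE."""
    # One inverse frequency per pair of dimensions.
    inv_freq = 1.0 / (base ** (np.arange(0, head_dim, 2) / head_dim))
    positions = np.arange(seq_len)
    return np.outer(positions, inv_freq)          # shape: (seq_len, head_dim // 2)

def apply_rope(x: np.ndarray, base: float = 10_000.0) -> np.ndarray:
    """Rotate query/key vectors x of shape (seq_len, head_dim) by their position."""
    seq_len, head_dim = x.shape
    theta = rope_angles(seq_len, head_dim, base)
    cos, sin = np.cos(theta), np.sin(theta)
    x_even, x_odd = x[:, 0::2], x[:, 1::2]
    rotated = np.empty_like(x)
    rotated[:, 0::2] = x_even * cos - x_odd * sin
    rotated[:, 1::2] = x_even * sin + x_odd * cos
    return rotated

# A larger base slows the rotation per position, so distant tokens remain
# distinguishable over longer contexts -- the intuition behind modifying
# the positional encoding for long-context models.
q = np.random.randn(4096, 128)
q_short = apply_rope(q, base=10_000.0)    # typical short-context setting
q_long  = apply_rope(q, base=500_000.0)   # hypothetical long-context setting
```

In this sketch, only the base frequency changes between the two calls; the model architecture is otherwise untouched, which is one reason such a change pairs naturally with continual pre-training rather than training from scratch.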