Meta researchers recently released a study introducing System 2 Attention (S2A), a novel attention mechanism for Transformer-based language models. S2A targets the reasoning failures that arise on complex tasks when standard soft attention latches onto irrelevant or misleading parts of the input: the model first regenerates its context to keep only the material relevant to the query, then attends to that regenerated context when producing its answer. Experiments show S2A outperforming standard attention across a range of tasks, offering a practical route to stronger reasoning in large language models.
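The two-step recipe can be sketched as two chained model calls, along the lines below. This is a minimal illustration under stated assumptions, not the paper's implementation: `generate` is a hypothetical stand-in for whatever LLM completion call is available, and the prompt wording only approximates the regenerate-then-answer idea.

```python
def generate(prompt: str) -> str:
    """Stand-in for any LLM completion call (hosted API or local model).
    Replace this with a real backend before use."""
    raise NotImplementedError("plug in an LLM backend here")


def s2a_answer(context: str, question: str) -> str:
    """Answer `question` about `context` using the two-step S2A recipe."""
    # Step 1: have the model regenerate the context, keeping only the
    # parts relevant to the question, so distracting or biasing text is
    # removed before the model attends to it.
    rewrite_prompt = (
        "Extract the parts of the text below that are relevant and "
        "unbiased with respect to the question, and rewrite them.\n\n"
        f"Text: {context}\n\nQuestion: {question}\n\nRelevant text:"
    )
    cleaned_context = generate(rewrite_prompt)

    # Step 2: answer from the regenerated context only, so attention is
    # no longer spent on the irrelevant tokens of the original input.
    answer_prompt = (
        f"Context: {cleaned_context}\n\nQuestion: {question}\n\nAnswer:"
    )
    return generate(answer_prompt)


# Example usage (requires a real `generate` implementation):
# print(s2a_answer("Mary has 3 apples. Max claims she has 10.",
#                  "How many apples does Mary have?"))
```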