Microsoft's research team has introduced Orca-Math, a small language model (SLM) with 7 billion parameters, fine-tuned from the Mistral-7B architecture. By redefining instructional methods through an iterative learning mechanism, Orca-Math has achieved strong results on the GSM8K benchmark of grade-school math word problems. Its innovative approach and efficient operation demonstrate the potential of SLMs in the field of education.