The OneBit method, proposed jointly by Tsinghua University and Harbin Institute of Technology, compresses large language model weights to 1 bit while retaining roughly 83% of the original model's performance. Where prior quantization work had struggled to go below 2 bits, OneBit achieves true 1-bit quantization, drawing wide attention in the academic community. The approach combines three components: a 1-bit linear layer structure, SVID-based parameter initialization, and quantization-aware training. This breakthrough opens new possibilities for deploying large models on PCs and smartphones, moving closer to the vision of running large models efficiently on mobile devices.
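The SVID initialization mentioned above can be illustrated with a rough sketch. The idea reported for OneBit is to split a weight matrix into its sign part (1 bit per entry) and a rank-1 approximation of its magnitudes, kept as two higher-precision value vectors. The function name and SVD-based factorization below are illustrative assumptions for this sketch, not the authors' exact implementation:

```python
import numpy as np

def svid_init(W):
    """Sketch of Sign-Value-Independent Decomposition (SVID).

    Approximates W ~= sign(W) * (a @ b.T), where the sign matrix S
    costs 1 bit per entry and a, b are two small value vectors that
    would be stored in higher precision (e.g. FP16).
    """
    S = np.sign(W)
    S[S == 0] = 1.0  # treat exact zeros as +1 so S is strictly {-1, +1}

    # Rank-1 approximation of the magnitude matrix |W| via truncated SVD;
    # the paper also discusses alternatives such as NMF for this step.
    U, s, Vt = np.linalg.svd(np.abs(W), full_matrices=False)
    a = U[:, :1] * np.sqrt(s[0])       # shape (m, 1)
    b = Vt[:1, :].T * np.sqrt(s[0])    # shape (n, 1)
    return S, a, b

# Toy example: decompose a random matrix and measure reconstruction error.
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 16))
S, a, b = svid_init(W)
W_hat = S * (a @ b.T)
rel_err = np.linalg.norm(W - W_hat) / np.linalg.norm(W)
```

In the full method, this decomposition only serves as the initialization; quantization-aware training then fine-tunes the compressed model to recover accuracy.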