The Mamba architecture challenges the Transformer, matching or surpassing it in language modeling. Mamba scales linearly with sequence length and delivers up to 5x higher inference throughput than Transformers, and it applies broadly across domains such as language, audio, and genomics. One of Mamba's key innovations is the selective state space model (sometimes called S6), which extends structured state space models (S4) by making the SSM parameters input-dependent, so the model can selectively propagate or forget information along the sequence. Because this selectivity makes the recurrence time-varying and rules out the efficient convolutional form used by earlier SSMs, Mamba pairs it with a hardware-aware parallel algorithm for computation; the model code and pre-trained checkpoints have been open-sourced.
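To make the selection mechanism concrete, below is a minimal, illustrative sketch of a selective SSM recurrence in NumPy. This is a toy sequential version under simplifying assumptions: the function name `selective_ssm` and the projection matrices `W_delta`, `W_B`, `W_C` are hypothetical, and the real Mamba implementation computes the same kind of recurrence with a fused, hardware-aware parallel scan rather than a Python loop.

```python
import numpy as np

def selective_ssm(x, A, W_delta, W_B, W_C):
    """Toy selective SSM scan (illustrative simplification, not Mamba's actual code).

    x:       (L, D) input sequence of length L with D channels
    A:       (D, N) state-transition parameter (typically negative-valued)
    W_delta: (D, D) projection producing the input-dependent step size
    W_B:     (D, N) projection producing the input-dependent input matrix
    W_C:     (D, N) projection producing the input-dependent output matrix

    The input-dependence of delta, B, and C is the "selection" mechanism:
    in a plain (non-selective) SSM these would be fixed across time steps.
    """
    L, D = x.shape
    N = A.shape[1]
    h = np.zeros((D, N))   # hidden state, one N-dim state per channel
    y = np.zeros((L, D))
    for t in range(L):
        # Selection: parameters are functions of the current input x[t]
        delta = np.log1p(np.exp(x[t] @ W_delta))  # softplus -> positive step size, (D,)
        B = x[t] @ W_B                            # (N,)
        C = x[t] @ W_C                            # (N,)
        # Discretize the continuous parameters (zero-order-hold-style approximation)
        A_bar = np.exp(delta[:, None] * A)        # (D, N)
        B_bar = delta[:, None] * B[None, :]       # (D, N)
        # Linear recurrence: h_t = A_bar * h_{t-1} + B_bar * x_t
        h = A_bar * h + B_bar * x[t][:, None]
        # Readout: y_t = C * h_t, summed over the state dimension
        y[t] = (h * C[None, :]).sum(axis=-1)
    return y

# Usage with random weights, just to show the shapes involved
L, D, N = 16, 8, 4
rng = np.random.default_rng(0)
y = selective_ssm(
    rng.standard_normal((L, D)),
    -np.abs(rng.standard_normal((D, N))),  # negative A keeps the recurrence stable
    rng.standard_normal((D, D)) * 0.1,
    rng.standard_normal((D, N)) * 0.1,
    rng.standard_normal((D, N)) * 0.1,
)
print(y.shape)  # (16, 8)
```

Note that the loop above runs in O(L) sequential steps; the point of Mamba's hardware-aware algorithm is to evaluate this same recurrence efficiently on GPUs, keeping the expanded state in fast on-chip memory, which is what makes the linear-time design practical.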