OLMo 2 13B is a transformer-based autoregressive language model developed by the Allen Institute for AI (AI2) and trained on up to 5 trillion tokens. On English academic benchmarks it matches or outperforms similarly sized fully open models and is competitive with open-weight models from Meta and Mistral. The OLMo 2 13B release includes all code, checkpoints, logs, and associated training details, with the aim of advancing open scientific research on language models.
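
Because the checkpoints are released openly, the model can be loaded with the Hugging Face `transformers` library. The snippet below is a minimal sketch; the repo id `allenai/OLMo-2-1124-13B` is an assumption based on AI2's Hub naming and should be checked against the official release page.

```python
# Minimal sketch: loading OLMo 2 13B via Hugging Face transformers.
# The repo id below is an assumed Hub checkpoint name; verify it on the
# AI2 organization page before use.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "allenai/OLMo-2-1124-13B"  # assumed checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Generate a short continuation from a prompt.
inputs = tokenizer("Language modeling is ", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```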