OLMo 2 13B

A high-performance open language model with strong results on English academic benchmarks

Common Product · Productivity · Language Model · Natural Language Processing
OLMo 2 13B is a transformer-based autoregressive language model developed by the Allen Institute for AI (AI2) and evaluated primarily on English academic benchmarks. Trained on up to 5 trillion tokens, it performs on par with or better than fully open models of similar size and is competitive with open-weight models from Meta and Mistral on English academic benchmarks. The release includes all code, checkpoints, logs, and associated training details, with the aim of advancing scientific research on language models.
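For readers who want to try the released checkpoints, the sketch below shows one common way to load and run them with the Hugging Face transformers library. The repository id "allenai/OLMo-2-1124-13B", the prompt, and the generation settings are assumptions based on AI2's public releases, not details taken from this page; consult the official model card for the exact identifiers and requirements.

# Minimal sketch of running an OLMo 2 13B checkpoint via Hugging Face transformers.
# Assumptions: the repo id below and a recent transformers version with OLMo 2 support.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMo-2-1124-13B"  # assumed Hugging Face repository id

# Load tokenizer and model weights (device_map="auto" requires the accelerate package).
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Encode a short prompt and generate a greedy continuation.
prompt = "Language models are"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))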

OLMo 2 13B Visit Over Time

Monthly Visits: 19,075,321
Bounce Rate: 45.07%
Pages per Visit: 5.5
Visit Duration: 00:05:32

OLMo 2 13B Visit Trend

OLMo 2 13B Visit Geography

OLMo 2 13B Traffic Sources

OLMo 2 13B Alternatives