OLMoE-1B-7B

An efficient open-source large language model.

OLMoE-1B-7B is a mixture-of-experts (MoE) large language model (LLM) with 1 billion active parameters out of 7 billion total, released in September 2024. The model outperforms models with a similar active-parameter count and competes with larger models such as Llama2-13B-Chat. OLMoE is fully open-source and supports a variety of uses, including text generation, model training, and deployment.
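The mixture-of-experts design is what lets the model keep only about 1B of its 7B parameters active per token: a router scores a set of expert networks and only the top-k run. Below is a minimal sketch of that routing mechanism, using toy sizes and simple linear "experts" for illustration; none of these dimensions reflect OLMoE's actual configuration.

```python
import numpy as np

def moe_forward(x, router_w, experts, k=2):
    """Route input x to the top-k experts and mix their outputs.

    Only k of the experts execute; the rest contribute nothing,
    which is how an MoE model activates a fraction of its parameters.
    """
    logits = x @ router_w                      # router score per expert
    top = np.argsort(logits)[-k:]              # indices of the k best experts
    weights = np.exp(logits[top] - logits[top].max())
    weights /= weights.sum()                   # softmax over selected experts only
    return sum(w * experts[i](x) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d, num_experts = 8, 4                          # toy sizes, not OLMoE's config
router_w = rng.normal(size=(d, num_experts))
# Each "expert" here is just a linear map standing in for a feed-forward block.
expert_mats = [rng.normal(size=(d, d)) for _ in range(num_experts)]
experts = [lambda x, m=m: x @ m for m in expert_mats]

x = rng.normal(size=d)
y = moe_forward(x, router_w, experts, k=2)
```

With k=2 of 4 experts selected, only half the expert parameters are touched for this token, mirroring (at toy scale) the 1B-active / 7B-total split in the model's name.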

OLMoE-1B-7B Visit Over Time

Monthly Visits: 26,103,677

Bounce Rate: 43.69%

Pages per Visit: 5.5

Visit Duration: 00:04:43

OLMoE-1B-7B Visit Trend

OLMoE-1B-7B Visit Geography

OLMoE-1B-7B Traffic Sources

OLMoE-1B-7B Alternatives