OLMoE

An open-source Mixture-of-Experts language model with 1.3 billion active parameters.

Common Product · Productivity · Natural Language Processing · Mixture-of-Experts Model
OLMoE is a fully open, state-of-the-art Mixture-of-Experts model with 1.3 billion active parameters and 6.9 billion total parameters. All data, code, and training logs associated with the model have been released, together with a comprehensive collection of resources accompanying the paper 'OLMoE: Open Mixture-of-Experts Language Models'. The model supports pre-training, fine-tuning, adaptation, and evaluation, marking a milestone in the field of natural language processing.
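As a quick illustration, the sketch below loads an OLMoE checkpoint for inference with Hugging Face Transformers. The model ID `allenai/OLMoE-1B-7B-0924` is assumed from the public release and may differ for other checkpoints; generation settings are illustrative only.

```python
# Minimal sketch: run OLMoE for text generation with Hugging Face Transformers.
# Assumption: the released checkpoint is available as "allenai/OLMoE-1B-7B-0924".
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMoE-1B-7B-0924"  # assumed Hugging Face model ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Only ~1.3B of the 6.9B parameters are activated per token by the MoE router.
inputs = tokenizer("Mixture-of-Experts models work by", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```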

OLMoE Visit Over Time

Monthly Visits

503,747,431

Bounce Rate

37.31%

Pages per Visit

5.7

Visit Duration

00:06:44

OLMoE Visit Trend

OLMoE Visit Geography

OLMoE Traffic Sources

OLMoE Alternatives