OLMoE

An open-source mixture-of-experts language model with 1.3 billion active parameters.

Tags: Productivity, Natural Language Processing, Mixture-of-Experts Model
OLMoE is a fully open, state-of-the-art mixture-of-experts model with 1.3 billion active parameters out of 6.9 billion total parameters. All data, code, and logs associated with the model have been released, and the release provides a comprehensive collection of resources accompanying the paper 'OLMoE: Open Mixture-of-Experts Language Models'. The model supports pre-training, fine-tuning, adaptation, and evaluation workflows, marking a milestone for fully open models in natural language processing.
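For readers who want to try the model, the sketch below loads an OLMoE checkpoint for text generation with the Hugging Face transformers library. The checkpoint ID allenai/OLMoE-1B-7B-0924 and the availability of OLMoE support in a recent transformers release are assumptions not stated in this listing; consult the official OLMoE release for authoritative instructions.

```python
# Minimal sketch: generating text with an OLMoE checkpoint via Hugging Face transformers.
# The model ID below is an assumption; a recent transformers version with MoE/OLMoE
# support is also assumed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMoE-1B-7B-0924"  # assumed published checkpoint ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
# 6.9B total parameters; only ~1.3B are active for any given input token.
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Mixture-of-experts models route each token to", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```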

OLMoE Visit Over Time

Monthly Visits: 515,580,771
Bounce Rate: 37.20%
Pages per Visit: 5.8
Visit Duration: 00:06:42

OLMoE Visit Trend

OLMoE Visit Geography

OLMoE Traffic Sources

OLMoE Alternatives