OLMoE is a fully open, state-of-the-art Mixture-of-Experts language model with 1.3 billion active parameters out of 6.9 billion total parameters. All data, code, and logs associated with the model have been released. This repository provides a comprehensive overview of resources related to the paper 'OLMoE: Open Mixture-of-Experts Language Models', covering pre-training, fine-tuning, adaptation, and evaluation, and marks a milestone for openness in natural language processing.
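As a quick illustration of using the released checkpoint for inference, below is a minimal sketch with the Hugging Face `transformers` library. The model ID `allenai/OLMoE-1B-7B-0924` is an assumption based on the release naming; substitute the actual Hub ID from the repository if it differs.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub ID for the released OLMoE checkpoint; adjust if the release uses a different name.
model_id = "allenai/OLMoE-1B-7B-0924"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation from a prompt.
inputs = tokenizer("Mixture-of-Experts models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```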