H2O-Danube-1.8B

A 1.8B parameter language model, open-source and free.

H2O-Danube-1.8B is a 1.8B-parameter language model trained on 1T tokens, following the core design principles of Llama 2 and Mistral. Despite being trained on far fewer tokens than reference models of similar size, it achieves highly competitive results across multiple benchmarks. A fine-tuned chat variant, trained with supervised fine-tuning followed by direct preference optimization (DPO), has also been released. H2O-Danube-1.8B is open-sourced under the Apache 2.0 license, further democratizing access to large language models for a wider audience.
H2O-Danube-1.8B Visit Over Time

Monthly Visits: 17,788,201
Bounce Rate: 44.87%
Pages per Visit: 5.4
Visit Duration: 00:05:32
