Nemotron-4-340B-Base
A large language model supporting text generation in multiple languages and programming languages.
Tags: Premium, New Product, Programming, Large Language Model, Multilingual Support
Nemotron-4-340B-Base is a large language model developed by NVIDIA, with 340 billion parameters and a context length of 4,096 tokens. It is intended for generating synthetic data and helping researchers and developers build their own large language models. The model was pre-trained on 9 trillion tokens spanning more than 50 natural languages and 40 programming languages. The NVIDIA Open Model License permits commercial use and the creation and distribution of derivative models, and NVIDIA claims no ownership of outputs generated with the model or any derived models.
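Below is a minimal sketch of how one might sample synthetic text from the base model with the Hugging Face Transformers library. The repo id "nvidia/Nemotron-4-340B-Base", the availability of a transformers-compatible checkpoint, and the prompt are assumptions for illustration; in practice a 340-billion-parameter model requires a multi-GPU deployment.

```python
# Minimal sketch: generating synthetic text with Nemotron-4-340B-Base.
# Assumes a transformers-compatible checkpoint under the (assumed) repo id
# "nvidia/Nemotron-4-340B-Base" and enough GPU memory to shard a 340B model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "nvidia/Nemotron-4-340B-Base"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # half precision to reduce memory footprint
    device_map="auto",           # shard layers across all available GPUs
)

# Keep prompt plus completion within the model's 4,096-token context window.
prompt = "Write three short customer-support questions about laptop batteries:\n1."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    inputs.input_ids,
    max_new_tokens=128,
    do_sample=True,      # sampling gives varied synthetic examples
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because this is a base (non-instruct) model, prompts that set up a completion pattern, as in the example above, tend to work better than conversational instructions.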
Nemotron-4-340B-Base Visits Over Time
Monthly Visits: 19,075,321
Bounce Rate: 45.07%
Pages per Visit: 5.5
Visit Duration: 00:05:32