SmolLM2-1.7B

A lightweight language model with 1.7 billion parameters, suitable for diverse tasks.

Common Product | Programming | Text Generation | Lightweight Model
SmolLM2 is a family of lightweight language models available in 135M, 360M, and 1.7B parameter sizes. The models handle a wide range of tasks while remaining small enough for on-device deployment. The 1.7B version shows significant improvements over its predecessor, SmolLM1-1.7B, in instruction following, knowledge, reasoning, and mathematics. It was trained on multiple datasets, including FineWeb-Edu, DCLM, and The Stack, and was further aligned with Direct Preference Optimization (DPO) using UltraFeedback. The model also supports tasks such as text rewriting, summarization, and function calling.
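
For orientation, the sketch below shows how a model of this kind is typically loaded for plain text generation with the Hugging Face transformers library. The Hub ID "HuggingFaceTB/SmolLM2-1.7B" and the generation settings are assumptions for illustration and should be checked against the actual model repository.

# Minimal sketch: load SmolLM2-1.7B and generate a short completion.
# Assumes the model is published under the Hub ID "HuggingFaceTB/SmolLM2-1.7B".
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HuggingFaceTB/SmolLM2-1.7B"  # assumed repository name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Tokenize a prompt and generate a continuation; the 1.7B footprint is small
# enough to run on a single consumer GPU or, more slowly, on CPU.
inputs = tokenizer("Gravity is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

For instruction-style use (rewriting, summarization), the DPO-aligned instruct variant of the model would be the more natural choice, loaded the same way under its own repository name.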

SmolLM2-1.7B Visits Over Time

Monthly Visits: 19,075,321
Bounce Rate: 45.07%
Pages per Visit: 5.5
Visit Duration: 00:05:32

SmolLM2-1.7B Visit Trend

SmolLM2-1.7B Visit Geography

SmolLM2-1.7B Traffic Sources

SmolLM2-1.7B Alternatives