SLM_Survey
Research, Measurement, and Insights on Small Language Models
SLM_Survey is a research project dedicated to small language models (SLMs), aiming to provide an in-depth understanding and technical assessment of these models through systematic research and measurement. The project covers transformer-based, decoder-only language models with 100M to 5B parameters. By surveying 59 state-of-the-art open-source SLMs, it analyzes their technological innovations and evaluates their capabilities across multiple domains, including commonsense reasoning, in-context learning, mathematics, and coding. It also benchmarks runtime costs such as inference latency and memory usage. This work is a valuable reference for advancing the study of SLMs.
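The survey's own benchmarking harness is not reproduced here, but the kind of runtime measurement it describes (inference latency and peak memory for a decoder-only SLM) can be sketched in a few lines. The following is a minimal sketch, assuming a Hugging Face transformers model; the model name Qwen/Qwen2-0.5B is only an illustrative choice within the 100M-5B range, not necessarily one of the 59 surveyed models.

```python
# Minimal sketch of an SLM runtime-cost measurement: end-to-end generation
# latency, decoding throughput, and peak GPU memory. Not the survey's harness.
import time

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Qwen/Qwen2-0.5B"  # illustrative SLM in the 100M-5B range (assumption)
device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if device == "cuda" else torch.float32

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=dtype).to(device)
model.eval()

prompt = "Question: If a train travels 60 km in 45 minutes, what is its speed? Answer:"
inputs = tokenizer(prompt, return_tensors="pt").to(device)

if device == "cuda":
    torch.cuda.reset_peak_memory_stats()

start = time.perf_counter()
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=128, do_sample=False)
if device == "cuda":
    torch.cuda.synchronize()  # ensure all queued GPU work is counted in the timing
elapsed = time.perf_counter() - start

new_tokens = output.shape[1] - inputs["input_ids"].shape[1]
print(f"end-to-end latency: {elapsed:.2f} s ({new_tokens / elapsed:.1f} tokens/s)")
if device == "cuda":
    print(f"peak GPU memory: {torch.cuda.max_memory_allocated() / 1e9:.2f} GB")
```

A fuller benchmark would repeat the measurement over several prompt lengths and batch sizes and average over warm runs, since the first call includes one-time model loading and kernel compilation costs.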
SLM_Survey Visits Over Time
Monthly Visits: 494,758,773
Bounce Rate: 37.69%
Pages per Visit: 5.7
Visit Duration: 00:06:29