Tele-FLM

An open-source multilingual large language model with 52 billion parameters

Tele-FLM (also known as FLM-2) is a 52-billion-parameter open-source multilingual large language model featuring a stable, efficient pre-training paradigm and enhanced fact-checking capabilities. Built on a decoder-only transformer architecture, it was trained on approximately 2 trillion tokens. Tele-FLM outperforms models of similar size and sometimes surpasses larger ones. In addition to releasing the model weights, the authors share the core design, engineering practices, and training details, in the hope that these will benefit both the academic and industrial communities.
Tele-FLM Visit Over Time

Monthly Visits: 19,075,321
Bounce Rate: 45.07%
Pages per Visit: 5.5
Visit Duration: 00:05:32

[Charts: Tele-FLM visit trend, visit geography, and traffic sources]