TinyLlama
Common Product · Chatting · Pre-trained Model · Chat
The TinyLlama project aims to pre-train a 1.1B-parameter Llama model on 3 trillion tokens. With some optimizations, this can be achieved in just 90 days using 16 A100-40G GPUs. Training began on 2023-09-01. TinyLlama adopts the same architecture and tokenizer as Llama 2, so it can be plugged into many open-source projects built on top of Llama. Additionally, with only 1.1B parameters, TinyLlama is compact enough to serve applications with limited computational and memory resources.
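Because TinyLlama shares Llama 2's architecture and tokenizer, the standard Llama loading path in Hugging Face Transformers works unchanged. The sketch below is a minimal illustration of that compatibility; the model ID shown is the published chat checkpoint and is an assumption here, so substitute whichever TinyLlama checkpoint you intend to use.

```python
# Minimal sketch: loading TinyLlama through the standard Llama code path.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hugging Face model ID; swap in an intermediate pre-training
# checkpoint if you want the base (non-chat) model.
model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # 1.1B parameters fit comfortably in fp16
    device_map="auto",
)

prompt = "Explain why small language models are useful on edge devices."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since the weights and tokenizer are drop-in compatible with Llama 2, the same approach applies to other Llama-based tooling (llama.cpp, vLLM, fine-tuning scripts) with no architecture-specific changes.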
TinyLlama Visits Over Time
Monthly Visits: 17,788,201
Bounce Rate: 44.87%
Pages per Visit: 5.4
Visit Duration: 00:05:32