Falcon Mamba

The first large-scale 7B model that operates without an attention mechanism.

Tags: Common Product, Programming, Large Models, No Attention
Falcon Mamba, released by the Technology Innovation Institute (TII) in Abu Dhabi, is the first 7B-scale model that does not use an attention mechanism. Its compute and memory costs do not grow with sequence length, yet its performance remains on par with current state-of-the-art models.
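To make the attention-free point concrete, below is a minimal sketch of loading and prompting the model with the Hugging Face transformers library. The repository id tiiuae/falcon-mamba-7b, the device_map="auto" placement, and the example prompt are assumptions for illustration, not details taken from this page.

# Minimal sketch: run Falcon Mamba with transformers (assumes a recent
# transformers release with FalconMamba support and the checkpoint being
# available as "tiiuae/falcon-mamba-7b").
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-mamba-7b"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# With no attention layers, there is no key-value cache that grows with the
# prompt; the per-token generation cost stays roughly constant for long inputs.
prompt = "Explain why attention-free models scale well with sequence length."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))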

Falcon Mamba Traffic Overview

Monthly Visits: 20,899,836
Bounce Rate: 46.04%
Pages per Visit: 5.2
Visit Duration: 00:04:57
