Falcon Mamba

The first 7B large-scale model that operates without an attention mechanism.

Common Product | Programming | Large Models | No Attention
Falcon Mamba is the first 7B-scale large model released by the Technology Innovation Institute (TII) in Abu Dhabi that operates without an attention mechanism. Because it is attention-free, its computational and memory costs do not grow with sequence length, while its performance remains on par with current state-of-the-art models.
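For readers who want to try the model, the sketch below shows one way to load and prompt it with the Hugging Face transformers library. The checkpoint id tiiuae/falcon-mamba-7b and support in a recent transformers release are assumptions based on the public release, not details taken from this page.

```python
# Minimal sketch: loading and prompting Falcon Mamba via transformers.
# Assumes the checkpoint "tiiuae/falcon-mamba-7b" is available on the
# Hugging Face Hub and that the installed transformers version supports it.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-mamba-7b"  # assumed public checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" requires the accelerate package; omit it to load on CPU.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Because the architecture is attention-free (a state-space model), generation
# carries a fixed-size recurrent state instead of a key-value cache that grows
# with the sequence, so per-token cost stays constant for long prompts.
inputs = tokenizer(
    "The Falcon Mamba architecture replaces attention with",
    return_tensors="pt",
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```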

Falcon Mamba Visit Over Time

Monthly Visits: 19,075,321
Bounce Rate: 45.07%
Pages per Visit: 5.5
Visit Duration: 00:05:32


Falcon Mamba Alternatives