ModuleFormer

ModuleFormer is an MoE-based architecture that includes two different types of experts: stick-breaking attention heads and feedforward experts. We released a collection of ModuleFormer-based language models (MoLM) ranging in scale from 4 billion to 8 billion parameters.
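
As a rough illustration of the feedforward-expert half of that design, below is a minimal PyTorch sketch of top-k token routing over a pool of feedforward experts. It is a toy under assumptions, not the actual ModuleFormer implementation (which additionally routes among stick-breaking attention heads, not shown here); the class and parameter names (`MoEFeedForward`, `n_experts`, `top_k`) are hypothetical.

```python
# Illustrative sketch of an MoE feedforward layer with top-k routing.
# Not the ModuleFormer code; names and structure are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoEFeedForward(nn.Module):
    def __init__(self, d_model: int, d_hidden: int, n_experts: int, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # The router scores every token against every expert.
        self.router = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.GELU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model); flatten to route each token independently.
        tokens = x.reshape(-1, x.shape[-1])
        scores = self.router(tokens)                       # (tokens, n_experts)
        weights, picked = scores.topk(self.top_k, dim=-1)  # k experts per token
        weights = F.softmax(weights, dim=-1)               # normalize over the k picks
        out = torch.zeros_like(tokens)
        for i, expert in enumerate(self.experts):
            mask = picked == i                             # slots that chose expert i
            rows = mask.any(dim=-1).nonzero(as_tuple=True)[0]
            if rows.numel() == 0:
                continue
            w = (weights * mask)[rows].sum(dim=-1, keepdim=True)
            out[rows] += w * expert(tokens[rows])          # weighted expert output
        return out.reshape(x.shape)

# Usage: route a batch of token embeddings through 8 experts, 2 per token.
layer = MoEFeedForward(d_model=64, d_hidden=256, n_experts=8, top_k=2)
y = layer(torch.randn(2, 10, 64))
print(y.shape)  # torch.Size([2, 10, 64])
```

A dense loop over experts like this is simple but slow; production MoE implementations typically gather tokens per expert into batches and add a load-balancing loss so routing does not collapse onto a few experts.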

Created: 2023-08-25T00:44:48
Updated: 2025-03-16T14:12:41
Stars: 217