
MoE-plus-plus

Public

[ICLR 2025] MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts

Created: 2024-10-08T15:49:40
Updated: 2025-03-27T01:48:30
https://arxiv.org/abs/2410.07348
Stars: 198
Stars increase: 0