
KVQuant

Public

[NeurIPS 2024] KVQuant: Towards 10 Million Context Length LLM Inference with KV Cache Quantization

Created: 2024-02-01T01:30:10
Updated: 2025-03-25T10:33:15
https://arxiv.org/abs/2401.18079
Stars: 339
Stars Increase: 1
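For context, the linked paper concerns low-bit quantization of the attention KV cache so that very long contexts fit in memory during LLM inference. The snippet below is a minimal, generic sketch of per-channel low-bit quantization of a cached key tensor; it is illustrative only and does not reproduce KVQuant's actual algorithm, which the paper describes in detail.

```python
import numpy as np

def quantize_per_channel(x: np.ndarray, n_bits: int = 4):
    """Symmetric per-channel quantization of a [seq_len, head_dim] tensor.

    Generic illustration of KV-cache quantization; not KVQuant's exact scheme.
    """
    qmax = 2 ** (n_bits - 1) - 1
    scale = np.abs(x).max(axis=0, keepdims=True) / qmax  # one scale per channel
    scale = np.where(scale == 0, 1.0, scale)              # avoid division by zero
    q = np.clip(np.round(x / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: np.ndarray) -> np.ndarray:
    return q.astype(np.float32) * scale

# Example: quantize a cached key tensor and check reconstruction error
keys = np.random.randn(2048, 128).astype(np.float32)  # [seq_len, head_dim]
q, scale = quantize_per_channel(keys, n_bits=4)
err = np.abs(dequantize(q, scale) - keys).mean()
print(f"mean abs error at 4 bits: {err:.4f}")
```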