
SMPyBandits

Public

Research framework for single-player and multi-player Multi-Armed Bandits (MAB) algorithms, implementing state-of-the-art algorithms for the single-player setting (UCB, KL-UCB, Thompson sampling...) and the multi-player setting (MusicalChair, MEGA, rhoRand, MCTopM/RandTopM, etc.). Available on PyPI: https://pypi.org/project/SMPyBandits/ with documentation at https://SMPyBandits.github.io/
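The single-player UCB policy named in the description can be sketched in plain NumPy as follows. This is a minimal illustration of the classic UCB1 index rule, not the SMPyBandits API; the `ucb1` helper, its parameters, and the example arm means are all hypothetical.

```python
import numpy as np

def ucb1(means, horizon=5000, seed=0):
    """Run the classic UCB1 index policy on Bernoulli arms.

    `means` holds the hidden success probabilities; the policy only
    observes sampled 0/1 rewards. Returns the pull count per arm.
    (Hypothetical sketch, not the SMPyBandits API.)
    """
    rng = np.random.default_rng(seed)
    K = len(means)
    counts = np.zeros(K)   # number of pulls per arm
    sums = np.zeros(K)     # cumulative reward per arm
    for t in range(horizon):
        if t < K:
            # play each arm once to initialise the indices
            arm = t
        else:
            # UCB1 index: empirical mean + exploration bonus sqrt(2 ln t / n_k)
            index = sums / counts + np.sqrt(2.0 * np.log(t) / counts)
            arm = int(np.argmax(index))
        reward = float(rng.random() < means[arm])
        counts[arm] += 1
        sums[arm] += reward
    return counts

counts = ucb1([0.1, 0.5, 0.9])
# the best arm (index 2) should receive the large majority of the pulls
```

The exploration bonus shrinks as an arm is pulled more often, so suboptimal arms are sampled only O(log T) times, which is the regret guarantee that motivates UCB-style policies.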

Created: 2016-11-17T21:27:49
Updated: 2025-03-21T15:12:59
https://SMPyBandits.github.io/
Stars: 400
Stars increase: 0