
dapt

Public

Code for "On the Surprising Efficacy of Distillation as an Alternative to Pre-Training Small Models"

Created: 2023-07-13T06:50:16
Updated: 2024-04-07T22:41:33
Stars: 4
Stars Increase: 0