
Question-Answering-with-BERT-and-Knowledge-Distillation


Fine-tuned BERT on the SQuAD 2.0 dataset. Applied knowledge distillation (KD) to fine-tune DistilBERT (the student) with BERT as the teacher model, reducing model size by 40% relative to the original BERT.
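The distillation step described above typically combines a soft-target loss (matching the teacher's temperature-softened output distribution) with the usual hard-label loss. A minimal NumPy sketch of such a combined loss is below; the temperature `T`, mixing weight `alpha`, and function names are illustrative assumptions, not the exact settings used in this repository:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Hinton-style KD loss (sketch):
    alpha * soft cross-entropy against the teacher + (1 - alpha) * hard CE.

    T and alpha are hypothetical hyperparameters for illustration.
    """
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T) + 1e-12)
    # Soft term: cross-entropy between teacher and student distributions,
    # scaled by T^2 to keep its gradient magnitude comparable across T.
    soft = -(p_teacher * log_p_student).sum(axis=-1).mean() * (T ** 2)
    # Hard term: standard cross-entropy against the ground-truth labels.
    log_p_hard = np.log(softmax(student_logits) + 1e-12)
    hard = -log_p_hard[np.arange(len(labels)), labels].mean()
    return alpha * soft + (1 - alpha) * hard
```

A student whose logits agree with both the teacher and the labels incurs a lower loss than one that disagrees, which is what drives the student toward the teacher's behavior during fine-tuning.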

Created: 2021-02-13T02:59:53
Updated: 2024-09-27T23:42:17
Stars: 25
Stars increase: 0