
BERT-models-finetuning

BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based method for learning language representations. It is a bidirectional transformer pre-trained on a large corpus with two objectives: masked language modeling and next-sentence prediction.
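
Since this repository is about fine-tuning BERT models, a minimal fine-tuning sketch is shown below. It assumes the Hugging Face `transformers` and `datasets` libraries; the checkpoint (`bert-base-uncased`), the dataset (GLUE SST-2), and the hyperparameters are illustrative choices, not this repository's exact configuration.

```python
# Minimal BERT fine-tuning sketch using Hugging Face Transformers.
# Assumptions: `transformers` and `datasets` are installed; the model
# checkpoint, dataset, and hyperparameters below are illustrative.
from transformers import (
    BertTokenizerFast,
    BertForSequenceClassification,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

# Load a pre-trained BERT checkpoint and add a 2-class classification head.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# GLUE SST-2: binary sentiment classification over single sentences.
dataset = load_dataset("glue", "sst2")

def tokenize(batch):
    # Truncate/pad each sentence to a fixed length for batching.
    return tokenizer(
        batch["sentence"], truncation=True, padding="max_length", max_length=128
    )

encoded = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="bert-sst2-finetuned",
    num_train_epochs=3,
    per_device_train_batch_size=16,
    learning_rate=2e-5,  # a common starting point for BERT fine-tuning
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
)
trainer.train()
```

During fine-tuning, all pre-trained weights are updated along with the new classification head; only the head is randomly initialized.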

Created: 2020-07-29T19:52:46
Updated: 2024-03-24T05:17:26
