
BERT-models-finetuning

Public

BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based method for learning language representations. It is a bidirectional transformer model pre-trained on a large corpus with two objectives: masked language modeling and next sentence prediction.
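
The sketch below is not this repository's actual code; it is a minimal illustration of fine-tuning a pre-trained BERT checkpoint for sequence classification, assuming the Hugging Face `transformers` library and PyTorch are available. The model name, labels, and toy examples are placeholders.

```python
# Minimal fine-tuning sketch (illustrative, not this repo's code):
# load a pre-trained BERT, attach a classification head, take one
# gradient step on a toy batch.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # new head is randomly initialized
)

# Toy labeled examples; a real run would iterate over a proper dataset.
texts = ["a delightful film", "a tedious, plodding mess"]
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

model.train()
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# The model returns the cross-entropy loss when `labels` are supplied.
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
```

In practice, fine-tuning runs this step over many batches for a few epochs; only the hyperparameters (learning rate, batch size, epochs) typically change per downstream task.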

Created: 2020-07-29T19:52:46
Updated: 2024-03-24T05:17:26
Stars: 2
Stars increase: 0
