
Law-OMNI-BERT-Project

Public

Directly applying transfer-learning advances from BERT yields poor accuracy in domain-specific areas such as law, because word distributions shift between general-domain corpora and domain-specific corpora. In this project, we demonstrate how the pre-trained language model BERT can be adapted to additional domains, such as contract law or court judgments.
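Domain adaptation of this kind usually means continuing BERT's masked-language-model pretraining on in-domain legal text before fine-tuning. As an illustrative sketch only (this code is not from the repository; the token list, vocabulary, and function name are hypothetical), the core MLM corruption step works like this: roughly 15% of tokens are selected, and of those, 80% are replaced with `[MASK]`, 10% with a random vocabulary token, and 10% are left unchanged.

```python
import random

MASK = "[MASK]"
# Hypothetical in-domain vocabulary for random replacement.
LEGAL_VOCAB = ["law", "court", "contract", "judgment", "clause", "statute"]

def mlm_corrupt(tokens, mask_prob=0.15, seed=0):
    """BERT-style masked-LM corruption of a token sequence.

    Returns (corrupted_tokens, labels), where labels[i] holds the
    original token at corrupted positions and None elsewhere.
    """
    rng = random.Random(seed)
    corrupted = list(tokens)
    labels = [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok          # model must predict the original token
            roll = rng.random()
            if roll < 0.8:
                corrupted[i] = MASK  # 80%: replace with [MASK]
            elif roll < 0.9:
                corrupted[i] = rng.choice(LEGAL_VOCAB)  # 10%: random token
            # remaining 10%: keep the original token unchanged
    return corrupted, labels

sentence = ["the", "court", "upheld", "the", "contract", "clause"] * 20
corrupted, labels = mlm_corrupt(sentence, seed=42)
```

During continued pretraining, the model is trained to recover `labels` from `corrupted`; fine-tuning on the downstream legal task then starts from these domain-adapted weights.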

Created: 2019-11-17T06:38:28
Updated: 2025-02-21T03:01:39
Stars: 7
Stars increase: 0