
Chinese-Tokenization

Public

Implementations of the Chinese word segmentation task using traditional methods (n-gram, HMM, etc.), neural network methods (CNN, LSTM, etc.), and pre-trained methods (BERT, etc.).
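As a rough illustration of the HMM approach listed above, here is a minimal B/M/E/S Viterbi segmentation sketch. The start, transition, and emission probabilities below are made-up assumptions for demonstration only, not parameters from this repository's trained models.

```python
import math

# Common 4-state tagging scheme: Begin / Middle / End of word, Single-char word.
STATES = "BMES"
NEG_INF = -1e9  # stand-in for log(0)

# Illustrative log-probabilities (assumed, not learned from a corpus).
start_p = {"B": math.log(0.6), "M": NEG_INF, "E": NEG_INF, "S": math.log(0.4)}
trans_p = {
    "B": {"M": math.log(0.3), "E": math.log(0.7)},
    "M": {"M": math.log(0.3), "E": math.log(0.7)},
    "E": {"B": math.log(0.6), "S": math.log(0.4)},
    "S": {"B": math.log(0.6), "S": math.log(0.4)},
}

def viterbi(text, emit_p):
    """Return the most likely B/M/E/S tag sequence for `text`.

    `emit_p(char, state)` should return a log emission probability.
    """
    V = [{s: start_p[s] + emit_p(text[0], s) for s in STATES}]
    path = {s: [s] for s in STATES}
    for ch in text[1:]:
        V.append({})
        new_path = {}
        for s in STATES:
            # Best predecessor state for reaching `s` at this position.
            prob, prev = max(
                (V[-2][p] + trans_p[p].get(s, NEG_INF) + emit_p(ch, s), p)
                for p in STATES
            )
            V[-1][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path
    best = max(STATES, key=lambda s: V[-1][s])
    return path[best]

def segment(text, emit_p):
    """Cut `text` into words at E/S tag boundaries."""
    words, word = [], ""
    for ch, tag in zip(text, viterbi(text, emit_p)):
        word += ch
        if tag in "ES":
            words.append(word)
            word = ""
    if word:  # flush a trailing unfinished word, if any
        words.append(word)
    return words
```

With trained emission probabilities (estimated from a tagged corpus), `segment` recovers word boundaries from the decoded tag sequence; the n-gram, CNN/LSTM, and BERT variants differ mainly in how the per-character tag scores are produced.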

Created: 2022-04-05T21:29:47
Updated: 2020-07-12T09:07:25
Stars: 32
Stars Increase: 0