
INTER-INTRA-attentions

Public

An experimental custom seq2seq model with both layer-wise (inter-layer) attention and intra-layer attention (attention to previous hidden states of the same RNN unit) for abstractive summarization.
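The intra-layer attention described above can be sketched as follows: at each step, the recurrent cell attends over its own previous hidden states and folds the resulting context into its update. This is a minimal illustrative sketch assuming PyTorch and a GRU cell; all names, shapes, and the scoring function are assumptions, not taken from this repository's code.

```python
# Minimal sketch of intra-layer attention: an RNN cell attending to its
# own previous hidden states. Illustrative only; not this repo's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class IntraLayerAttentionRNN(nn.Module):
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # The cell consumes the input concatenated with an attention context.
        self.cell = nn.GRUCell(input_size + hidden_size, hidden_size)
        # Bilinear scoring of the current state against past states (assumed form).
        self.attn_score = nn.Linear(hidden_size, hidden_size, bias=False)

    def forward(self, inputs: torch.Tensor) -> torch.Tensor:
        # inputs: (batch, seq_len, input_size)
        batch, seq_len, _ = inputs.shape
        h = inputs.new_zeros(batch, self.cell.hidden_size)
        history = []   # previous hidden states of this same layer
        outputs = []
        for t in range(seq_len):
            if history:
                past = torch.stack(history, dim=1)               # (batch, t, hidden)
                scores = torch.bmm(past, self.attn_score(h).unsqueeze(2))  # (batch, t, 1)
                weights = F.softmax(scores, dim=1)
                context = (weights * past).sum(dim=1)            # (batch, hidden)
            else:
                context = torch.zeros_like(h)                    # no history at t=0
            h = self.cell(torch.cat([inputs[:, t], context], dim=1), h)
            history.append(h)
            outputs.append(h)
        return torch.stack(outputs, dim=1)  # (batch, seq_len, hidden)

# Usage: out = IntraLayerAttentionRNN(16, 32)(torch.randn(4, 10, 16))
```

Inter-layer attention would follow the same pattern, except that the query layer attends over the hidden states of the layer below rather than its own history.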

Created: 2017-10-09T23:16:54
Updated: 2024-10-24T15:20:52
Stars: 10
Stars Increase: 0
