
Neural-Machine-Translation-with-Attention

Public

I implement encoder-decoder seq2seq models with attention using Keras. The encoder can be a Bidirectional LSTM, a plain LSTM, or a GRU, and the decoder can be an LSTM or a GRU. I evaluate the models on an English-French translation dataset.
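As a rough sketch of the architecture described above, the following builds one of the variants (Bidirectional LSTM encoder, LSTM decoder) in Keras with dot-product attention over the encoder outputs. The vocabulary sizes and hidden dimension are hypothetical placeholders, and the exact layer wiring in the repository may differ:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

SRC_VOCAB, TGT_VOCAB, UNITS = 1000, 1000, 64  # hypothetical sizes

# Encoder: Bidirectional LSTM over source token ids
enc_in = layers.Input(shape=(None,), dtype="int32")
enc_emb = layers.Embedding(SRC_VOCAB, UNITS)(enc_in)
enc_seq, fh, fc, bh, bc = layers.Bidirectional(
    layers.LSTM(UNITS, return_sequences=True, return_state=True)
)(enc_emb)
# Merge forward/backward states to initialize the decoder
state_h = layers.Concatenate()([fh, bh])
state_c = layers.Concatenate()([fc, bc])

# Decoder: LSTM fed with target tokens (teacher forcing at train time)
dec_in = layers.Input(shape=(None,), dtype="int32")
dec_emb = layers.Embedding(TGT_VOCAB, UNITS)(dec_in)
dec_seq = layers.LSTM(2 * UNITS, return_sequences=True)(
    dec_emb, initial_state=[state_h, state_c]
)

# Dot-product attention: decoder states query the encoder outputs
context = layers.Attention()([dec_seq, enc_seq])
combined = layers.Concatenate()([dec_seq, context])
probs = layers.Dense(TGT_VOCAB, activation="softmax")(combined)

model = Model([enc_in, dec_in], probs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Shape check on dummy batches: output is one distribution per target step
out = model.predict(
    [np.zeros((2, 5), dtype="int32"), np.zeros((2, 7), dtype="int32")],
    verbose=0,
)
print(out.shape)
```

Swapping the encoder for a plain LSTM or GRU mainly changes the state handling (a GRU returns a single state, and without `Bidirectional` there is no forward/backward merge), while the attention and output layers stay the same.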

Created: 2020-08-29T00:55:03
Updated: 2024-02-29T12:55:03
Stars: 3
Stars Increase: 0
