3) Translate sentences
th translate.lua -model model_final.t7 -src data/src-test.txt -output pred.txt
See the guide for more details: http://opennmt.github.io/Guide
Research

The main model is based on the papers Neural Machine Translation by Jointly Learning to Align and Translate, Bahdanau et al., ICLR 2015, and Effective Approaches to Attention-based Neural Machine Translation, Luong et al., EMNLP 2015.
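The core idea shared by these papers can be sketched in a few lines. Below is a minimal NumPy illustration of Luong-style dot-product ("global") attention; it is an illustrative sketch under assumed names and dimensions, not OpenNMT's actual implementation:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def global_attention(decoder_state, encoder_states):
    """Luong-style dot-product attention (illustrative).

    decoder_state: shape (d,), the current target hidden state.
    encoder_states: shape (T, d), one hidden state per source position.
    Returns the context vector (d,) and the attention weights (T,).
    """
    scores = encoder_states @ decoder_state   # dot score per source position
    weights = softmax(scores)                 # normalize into a distribution
    context = weights @ encoder_states        # weighted sum of encoder states
    return context, weights

# Toy example with random states (T=5 source positions, d=4 hidden units).
T, d = 5, 4
rng = np.random.default_rng(0)
h_t = rng.standard_normal(d)
h_s = rng.standard_normal((T, d))
context, weights = global_attention(h_t, h_s)
print(weights.sum())  # the weights form a probability distribution (sum to 1)
```

Bahdanau-style attention differs mainly in the scoring function (a small feed-forward network instead of a dot product) and in where the attention is computed within the decoder step.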
On top of the base model there are a large number of options, thanks to the excellent work of SYSTRAN (http://www.systransoft.com/). In particular, the following features have been implemented:
Effective Approaches to Attention-based Neural Machine Translation. Luong et al., EMNLP 2015.
Character-based Neural Machine Translation. Costa-Jussa and Fonollosa, ACL 2016.
Compression of Neural Machine Translation Models via Pruning. See et al., COLING 2016.
Sequence-Level Knowledge Distillation. Kim and Rush, EMNLP 2016.
Deep Recurrent Models with Fast Forward Connections for Neural Machine Translation. Zhou et al., TACL 2016.
Guided Alignment Training for Topic-Aware Neural Machine Translation. Chen et al., arXiv:1607.01628.
Linguistic Input Features Improve Neural Machine Translation. Sennrich and Haddow, arXiv:1606.02892.
Acknowledgments

The OpenNMT implementation uses code from the following projects:
Andrej Karpathy's char-rnn: https://github.com/karpathy/char-rnn
Wojciech Zaremba's LSTM: https://github.com/wojzaremba/lstm
The Element-Research rnn library: https://github.com/Element-Research/rnn

License
MIT