Peer-Reviewed Journal Details
Mandatory Fields
Jinhua Du, Andy Way
2017
June
The Prague Bulletin of Mathematical Linguistics
Pre-Reordering for Neural Machine Translation: Helpful or Harmful?
Published
Optional Fields
Volume: 108
Issue: 1
Start Page: 171
End Page: 182
Pre-reordering, a preprocessing step that rearranges source-side word order to more closely match that of the target side, has proven very helpful for improving translation quality in statistical machine translation (SMT). However, is this also the case in neural machine translation (NMT)? In this paper, we first investigate the impact of pre-reordered source-side data on NMT, and then propose to incorporate the features used by the SMT pre-reordering model as input factors into NMT (factored NMT). These features, namely part-of-speech (POS) tags, word classes and reordered indices, are encoded as feature vectors and concatenated to the word embeddings to provide extra knowledge for NMT. Pre-reordering experiments conducted on Japanese↔English and Chinese↔English show that pre-reordering the source-side data for NMT is redundant, and that NMT models trained on pre-reordered data degrade translation performance. However, factored NMT using SMT-based pre-reordering features on Japanese→English and Chinese→English is beneficial and further improves translation quality by 4.48 and 5.89 relative BLEU points, respectively, compared to the baseline NMT system.
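The factored-input idea described in the abstract, concatenating embeddings of source-side factors (POS tag, word class, reordered index) to each word embedding before the encoder, can be sketched roughly as below. This is a minimal illustration only, not the authors' implementation; all dimensions, vocabulary sizes and names are assumptions made for the example.

```python
# Minimal sketch of a factored NMT input layer (assumed dimensions and names,
# not the paper's implementation): each source token carries a word ID plus
# factor IDs (POS, word class, reordered index); their embeddings are
# concatenated and fed to the encoder in place of plain word embeddings.
import torch
import torch.nn as nn

class FactoredInput(nn.Module):
    def __init__(self, vocab_size, pos_size, class_size, max_index,
                 word_dim=500, factor_dim=16):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        self.pos_emb = nn.Embedding(pos_size, factor_dim)
        self.class_emb = nn.Embedding(class_size, factor_dim)
        self.index_emb = nn.Embedding(max_index, factor_dim)

    def forward(self, words, pos_tags, word_classes, reorder_idx):
        # Each argument: LongTensor of shape (batch, seq_len).
        # Returns (batch, seq_len, word_dim + 3 * factor_dim): the word
        # embedding with the three factor vectors appended.
        return torch.cat([self.word_emb(words),
                          self.pos_emb(pos_tags),
                          self.class_emb(word_classes),
                          self.index_emb(reorder_idx)], dim=-1)
```

The encoder then consumes these concatenated vectors exactly as it would ordinary word embeddings, only with a larger input dimension.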
https://www.degruyter.com/downloadpdf/j/pralin.2017.108.issue-1/pralin-2017-0018/pralin-2017-0018.pdf
10.1515/pralin-2017-0018
Grant Details
Science Foundation Ireland (SFI)