What is it about?
Changing the word order of one language to make it more similar to another (pre-reordering) is known to improve machine translation quality. Neural MT (NMT) is all the rage in MT at the moment, but no one had tried pre-reordering for NMT before. The punchline? It doesn't help, at least not for Japanese↔English and Chinese↔English. However, using insights about reordering from the previously dominant approach, statistical MT, does improve NMT performance. In this respect, what we have built is a hybrid MT system.
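To make the idea of pre-reordering concrete, here is a toy sketch (not the paper's actual method): the source sentence is rewritten into the target language's word order before translation. The rule and function name below are hypothetical, illustrating a single English SVO clause being rearranged into Japanese-like SOV order.

```python
def preorder_svo_to_sov(tokens):
    """Toy pre-reordering rule: move the verb after the object
    so a simple 3-token English SVO clause mimics Japanese SOV order."""
    if len(tokens) == 3:
        subj, verb, obj = tokens
        return [subj, obj, verb]
    # Longer or shorter inputs are left unchanged in this sketch.
    return tokens

print(preorder_svo_to_sov(["She", "reads", "books"]))
# English "She reads books" -> Japanese-like order "She books reads"
```

A real pre-reordering system applies many such rules (often learned from parse trees) across whole sentences; this sketch only conveys the basic idea.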
Why is it important?
MT quality is improving all the time, but it still falls below the quality required for many applications. If we can make the output better, more people are likely to use MT for a range of use cases. In this paper, we do improve translation quality for Japanese-to-English and Chinese-to-English.
Read the Original
This page is a summary of: Pre-Reordering for Neural Machine Translation: Helpful or Harmful?, Prague Bulletin of Mathematical Linguistics, January 2017, De Gruyter,
DOI: 10.1515/pralin-2017-0018.