2016. Linguistic Input Features Improve Neural Machine Translation. Proceedings of the First Conference on Machine Translation: Volume 1, Research Papers. Berlin, Germany, pages 83-91. https://www.aclweb.org/anthology/W16-2209/ Vaswani, Ashish and Shazeer, Noam and Parmar, N...
Beyond weight tying: Learning joint input-output embeddings for neural machine translation. In: Proceedings of the Third Conference on Machine Translation: Research Papers. Brussels, pp 73–83. Parikh A, Täckström O, Das D, Uszkoreit J (2016) A decomposable attention model for natural ...
We are working on neural machine translation, using deep neural networks for translation. We achieved human parity in translating news from Chinese to English. Our Papers Xu Tan, Yi Ren, Di He, Tao Qin, Tie-Yan Liu, Multilingual Neural Machine Translation with Knowledge...
on Machine Translation: Research Papers, Brussels, 2018, Brussels: Association for Computational Linguistics, 2018, pp. 124–132. https://doi.org/10.18653/v1/W18-6313 Vilar, D., Learning hidden unit contribution for adapting neural machine translation models, Proc. 2018 Conf. of the North ...
2. Bahdanau, D., Cho, K., Bengio, Y.: Neural machine translation by jointly … Towards Interpretable Chit-chat: Open Domain Dialogue Generation with Dialogue Acts W Wu, C Xu, Y Wu, Z Li – 2018 – openreview.net… Traditional research on conversational agents focuses on task-oriented ...
Unsupervised neural machine translation (UNMT) has recently achieved remarkable results with only large monolingual corpora in each language. However, the uncertainty of associating target with source sentences makes UNMT theoretically an ill-posed problem. This work investigates the possibility of utilizin...
This paper demonstrates that multilingual denoising pre-training produces significant performance gains across a wide variety of machine translation (MT) tasks. We present mBART -- a sequence-to-sequence denoising auto-encoder pre-trained on large-scale monolingual corpora in many languages using the ...
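The mBART objective described above trains a sequence-to-sequence model to reconstruct original text from a noised version. A minimal sketch of the noising side, assuming whitespace tokenization and simplified hyperparameters (the `mask_ratio`, single-span choice, and `<mask>` token are illustrative assumptions, not the paper's exact settings):

```python
import random

def mbart_noise(sentences, mask_token="<mask>", mask_ratio=0.35, seed=0):
    """Toy mBART-style noising: permute sentence order, then replace one
    contiguous token span (~mask_ratio of tokens) per sentence with a
    single mask token (text infilling). Illustrative sketch only."""
    rng = random.Random(seed)
    shuffled = sentences[:]
    rng.shuffle(shuffled)  # sentence-order permutation
    noised = []
    for sent in shuffled:
        tokens = sent.split()
        n_to_mask = max(1, int(len(tokens) * mask_ratio))
        start = rng.randrange(0, len(tokens) - n_to_mask + 1)
        # collapse the whole span into one mask token
        tokens[start:start + n_to_mask] = [mask_token]
        noised.append(" ".join(tokens))
    return noised
```

During pre-training, the denoising auto-encoder would receive `mbart_noise(sentences)` as input and be trained to emit the original `sentences`; at fine-tuning time the same encoder-decoder is repurposed for translation.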
However, SMT systems should not be written off completely, as there are many cases where SMT produces a better-quality translation than NMT. For this reason, Omniscien has taken the Hybrid Machine Translation approach, which seamlessly integrates the strengths of both technologies to deliver ...
Alkouli, Tamer, Gabriel Bretschner, and Hermann Ney. "On the Alignment Problem in Multi-Head Attention-Based Neural Machine Translation." Proceedings of the 3rd WMT: Research Papers (2018). Tang, Gongbo, Rico Sennrich, and Joakim Nivre. "An Analysis of Attention Mechanisms: The Case of Word...
This article highlights six of the papers submitted by Microsoft Research Asia. The papers cover a range of topics, including encoder-decoder frameworks, natural language generation, knowledge neurons, extractive text summarization, pre-trained language models,...