Saturday, March 25, 2017

Neural machine translation

begin quote from:

Google Neural Machine Translation (GNMT) - which translates "whole sentences at a time"


Neural machine translation

From Wikipedia, the free encyclopedia
Neural machine translation (NMT) is an approach to machine translation in which a large artificial neural network is trained using deep learning techniques. It is a radical departure from phrase-based statistical translation approaches, in which a translation system consists of subcomponents that are separately engineered.[1] In November 2016, Google and Microsoft announced that their translation services now use NMT. Google uses Google Neural Machine Translation (GNMT) in preference to its previous statistical methods.[2] Microsoft uses a similar deep-neural-network-powered machine translation technology for all its speech translations (including Microsoft Translator live and Skype Translator). An open-source neural machine translation system, OpenNMT,[3] has additionally been released by the Harvard NLP group.
NMT models apply deep representation learning. They require only a fraction of the memory needed by traditional statistical machine translation (SMT) models. Furthermore, unlike conventional translation systems, all parts of the neural translation model are trained jointly (end-to-end) to maximize translation performance.[4][5][6]
A bidirectional recurrent neural network (RNN), known as an encoder, encodes a source sentence into a vector representation for a second RNN, known as a decoder, which predicts words in the target language.[7]
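The encoder–decoder idea can be sketched in a few lines of NumPy. This is a minimal, untrained toy (random weights, a unidirectional encoder for brevity, and hypothetical vocabulary sizes and dimensions), not how GNMT or OpenNMT is implemented; it only shows the data flow: the encoder compresses the source sentence into a hidden state, and the decoder unrolls from that state to emit target-word distributions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes, chosen arbitrarily for illustration.
src_vocab, tgt_vocab, hidden = 5, 6, 8

# Encoder weights: a simple RNN that reads the source sentence
# token by token and compresses it into a final hidden state.
W_xe = rng.normal(scale=0.1, size=(hidden, src_vocab))
W_he = rng.normal(scale=0.1, size=(hidden, hidden))

def encode(src_ids):
    h = np.zeros(hidden)
    for tok in src_ids:
        x = np.eye(src_vocab)[tok]          # one-hot "embedding"
        h = np.tanh(W_xe @ x + W_he @ h)    # recurrent update
    return h                                # fixed-size sentence vector

# Decoder weights: a second RNN seeded with the encoder state;
# at each step it produces a distribution over target words.
W_xd = rng.normal(scale=0.1, size=(hidden, tgt_vocab))
W_hd = rng.normal(scale=0.1, size=(hidden, hidden))
W_out = rng.normal(scale=0.1, size=(tgt_vocab, hidden))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def decode(h, max_len=4, bos=0):
    tok, out = bos, []
    for _ in range(max_len):
        x = np.eye(tgt_vocab)[tok]
        h = np.tanh(W_xd @ x + W_hd @ h)
        probs = softmax(W_out @ h)
        tok = int(probs.argmax())           # greedy pick of next word
        out.append(tok)
    return out

translation = decode(encode([1, 3, 2]))
print(translation)
```

In a real NMT system both RNNs (plus learned embeddings and, in modern systems, an attention mechanism) are trained jointly end-to-end by backpropagating the translation loss through the decoder into the encoder, which is exactly the joint training the paragraph above contrasts with separately engineered SMT subcomponents.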

References


  • Wołk, Krzysztof; Marasek, Krzysztof (2015). "Neural-based Machine Translation for Medical Text Domain. Based on European Medicines Agency Leaflet Texts". Procedia Computer Science. 64 (64): 2–9. doi:10.1016/j.procs.2015.08.456.


  • Lewis-Kraus, Gideon (December 14, 2016). "The Great A.I. Awakening". The New York Times. Retrieved 2016-12-21.

  • "OpenNMT - Open-Source Neural Machine Translation". opennmt.net. Retrieved 2017-03-22.

  • Kalchbrenner, Nal; Blunsom, Philip (2013). "Recurrent Continuous Translation Models". Proceedings of the Association for Computational Linguistics.

  • Sutskever, Ilya; Vinyals, Oriol; Le, Quoc Viet (2014). "Sequence to sequence learning with neural networks". NIPS.

  • Kyunghyun Cho; Bart van Merrienboer; Dzmitry Bahdanau; Yoshua Bengio (3 September 2014). "On the Properties of Neural Machine Translation: Encoder–Decoder Approaches". arXiv:1409.1259 [cs.CL].
