Analyses and Modeling of Deep Learning Neural Networks for Sequence-to-Sequence Translation

  • R Lokeshkumar, K Jayakumar, Vishal Prem, Mark Sheridan Nonghuloo

Abstract

Statistical machine translation has allowed us to construct models for translating a sequence of text from one language into another, but such models are limited when translating material that differs from the training corpora. This motivates neural machine translation, which applies an encoder-decoder network to a sequence of text and outputs a sequence of text in a different language. In this paper, we propose a neural network model called the LSTM Encoder-Decoder, which consists of two long short-term memory (LSTM) networks. One network encodes a sequence of characters into a fixed-length vector representation, and the other decodes that representation into another sequence of characters.
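The encode-then-decode flow described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the vocabulary size, hidden size, greedy decoding loop, and random (untrained) weights are all assumptions made for the sake of a self-contained example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step: input/forget/output gates plus candidate cell."""
    H = h.shape[0]
    z = W @ x + U @ h + b          # stacked gate pre-activations, shape (4H,)
    i = sigmoid(z[:H])             # input gate
    f = sigmoid(z[H:2*H])          # forget gate
    o = sigmoid(z[2*H:3*H])        # output gate
    g = np.tanh(z[3*H:])           # candidate cell state
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

VOCAB, HIDDEN = 32, 16  # hypothetical character-vocabulary and state sizes

def init_params():
    return (rng.normal(0, 0.1, (4 * HIDDEN, VOCAB)),
            rng.normal(0, 0.1, (4 * HIDDEN, HIDDEN)),
            np.zeros(4 * HIDDEN))

enc_params = init_params()   # encoder LSTM weights (untrained)
dec_params = init_params()   # decoder LSTM weights (untrained)
W_out = rng.normal(0, 0.1, (VOCAB, HIDDEN))  # maps decoder state to characters

def one_hot(idx):
    v = np.zeros(VOCAB)
    v[idx] = 1.0
    return v

def encode(char_ids):
    """Compress a source character sequence into a fixed-length (h, c) state."""
    h = c = np.zeros(HIDDEN)
    for idx in char_ids:
        h, c = lstm_step(one_hot(idx), h, c, *enc_params)
    return h, c

def decode(h, c, start_id, n_steps):
    """Greedily emit target characters, feeding each prediction back in."""
    idx, out = start_id, []
    for _ in range(n_steps):
        h, c = lstm_step(one_hot(idx), h, c, *dec_params)
        idx = int(np.argmax(W_out @ h))
        out.append(idx)
    return out

h, c = encode([3, 7, 1, 9])       # encoder: source characters -> fixed vector
translated = decode(h, c, 0, 5)   # decoder: fixed vector -> target characters
```

With random weights the output is meaningless; in practice both networks and the output projection are trained jointly so that the fixed-length state carries enough information to reproduce the target sequence.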

Keywords: Deep Learning, Recurrent Neural Network, LSTM, Neural Machine Translation, Semi-Supervised Machine Learning.

Published
2020-04-25
How to Cite
R Lokeshkumar, K Jayakumar, Vishal Prem, Mark Sheridan Nonghuloo. (2020). Analyses and Modeling of Deep Learning Neural Networks for Sequence-to-Sequence Translation. International Journal of Advanced Science and Technology, 29(05), 3152–3159. Retrieved from http://sersc.org/journals/index.php/IJAST/article/view/11633