Abstractive Text Summarization with Neural Network Based Sequence-to-Sequence Models

  • Pramod Kumar Amaravarapu, Akhil Khare

Abstract

Abstractive text summarization has gained popularity with the rise of sequence-to-sequence (seq2seq) neural network models. Many improvements to seq2seq models have been proposed to address issues such as fluency and the generation of high-quality, human-like summaries. The algorithms differ mainly in neural network architecture, parameter inference, and summary generation or decoding. In this paper, we provide a detailed literature survey of various approaches to abstractive text summarization, focusing on the latest developments in seq2seq neural network models.

Keywords: Abstractive text summarization, recurrent neural network, sequence-to-sequence, attention network, reinforcement learning, beam search.

Published
2020-05-06
How to Cite
Pramod Kumar Amaravarapu, Akhil Khare. (2020). Abstractive Text Summarization with Neural Network Based Sequence-to-Sequence Models. International Journal of Advanced Science and Technology, 29(06), 3129-3135. Retrieved from https://sersc.org/journals/index.php/IJAST/article/view/13874