Abstractive Text Summarization with Neural Network Based Sequence-to-Sequence Models
Abstract
Abstractive text summarization has gained popularity with the rise of sequence-to-sequence (seq2seq) neural network models. Many improvements to seq2seq models have been proposed to address issues such as fluency and the generation of human-like, high-quality summaries. These approaches differ mainly in neural network architecture, parameter inference, and summary generation (decoding). In this paper, we provide a detailed literature survey of approaches to abstractive text summarization, focusing on recent developments in seq2seq neural network models.
Keywords: Abstractive text summarization, recurrent neural network, sequence-to-sequence, attention network, reinforcement learning, beam search.