Text-Summarization

Abstractive and Extractive Deep Learning Methods for Text Summarization

Single-Document Summarization

  1. Alexander M. Rush, Sumit Chopra, Jason Weston. “A Neural Attention Model for Abstractive Sentence Summarization”. EMNLP (2015).

  2. Ramesh Nallapati, Bing Xiang, Bowen Zhou. “Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond”. CoNLL (2016).

  3. Sumit Chopra, Alexander M. Rush and Michael Auli. “Abstractive Sentence Summarization with Attentive Recurrent Neural Networks”. NAACL (2016).

  4. Jianpeng Cheng, Mirella Lapata. “Neural Summarization by Extracting Sentences and Words”. ACL (2016).

  5. Kristina Toutanova, Chris Brockett, Ke M. Tran, Saleema Amershi. “A Dataset and Evaluation Metrics for Abstractive Compression of Sentences and Short Paragraphs”. EMNLP (2016).

  6. Abigail See, Peter J. Liu, Christopher D. Manning. “Get To The Point: Summarization with Pointer-Generator Networks”. ACL (2017).

  7. Qingyu Zhou, Nan Yang, Furu Wei, Ming Zhou. “Selective Encoding for Abstractive Sentence Summarization”. ACL (2017).

  8. Ramesh Nallapati, Bing Xiang, Bowen Zhou. “SummaRuNNer: A Recurrent Neural Network Based Sequence Model for Extractive Summarization of Documents”. AAAI (2017).

  9. Jiatao Gu, Zhengdong Lu, Hang Li, Victor O.K. Li. “Incorporating Copying Mechanism in Sequence-to-Sequence Learning”. ACL (2016).

  10. Piji Li, Wai Lam, Lidong Bing, Zihao Wang. “Deep Recurrent Generative Decoder for Abstractive Text Summarization”. EMNLP (2017).

  11. Jin-ge Yao, Xiaojun Wan, Jianguo Xiao. “Recent Advances in Document Summarization”. KAIS (2017), survey paper.

  12. Shibhansh Dohare, Harish Karnick. “Text Summarization using Abstract Meaning Representation”. arXiv (2017).

  13. Jeffrey Ling, Alexander M. Rush. “Coarse-to-Fine Attention Models for Document Summarization”. NFiS@EMNLP (2017).

  14. Romain Paulus, Caiming Xiong, Richard Socher. “A Deep Reinforced Model for Abstractive Summarization”. ICLR (2018).

  15. Asli Celikyilmaz, Antoine Bosselut, Xiaodong He, Yejin Choi. “Deep Communicating Agents for Abstractive Summarization”. NAACL (2018).

  16. Arman Cohan, Franck Dernoncourt, Doo Soon Kim, Trung Bui, Seokhwan Kim, Walter Chang, Nazli Goharian. “A Discourse-Aware Attention Model for Abstractive Summarization of Long Documents”. NAACL (2018).
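The copy mechanism at the heart of entry 6 (See et al.'s pointer-generator) mixes a vocabulary distribution with the attention distribution over source tokens, so out-of-vocabulary words can be copied directly. A minimal sketch of that mixing step, assuming toy hand-set probabilities (the function name and inputs here are illustrative, not from the paper's code):

```python
def pointer_generator_dist(p_gen, vocab_dist, attention, source_tokens):
    """Sketch of the pointer-generator output distribution:
    P(w) = p_gen * P_vocab(w) + (1 - p_gen) * sum of attention weights
    over the source positions where w occurs."""
    final = {w: p_gen * p for w, p in vocab_dist.items()}
    for a, tok in zip(attention, source_tokens):
        # copy probability mass flows to source tokens, including OOV words
        final[tok] = final.get(tok, 0.0) + (1.0 - p_gen) * a
    return final

vocab_dist = {"the": 0.5, "cat": 0.3, "sat": 0.2}   # toy P_vocab
attention  = [0.7, 0.2, 0.1]                        # toy attention over source
source     = ["germany", "the", "cat"]              # "germany" is OOV
dist = pointer_generator_dist(0.8, vocab_dist, attention, source)
```

Because both input distributions sum to one, the mixture does too; note that the OOV token "germany" gets nonzero probability purely from the copy term.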

Multi-Document Summarization

  1. Peter J. Liu, Mohammad Saleh, Etienne Pot, Ben Goodrich, Ryan Sepassi, Lukasz Kaiser, Noam Shazeer. “Generating Wikipedia by Summarizing Long Sequences”. ICLR (2018).

Meeting Summarization

  1. Abstractive Meeting Summarization with Entailment and Fusion.

  2. A Template-based Abstractive Meeting Summarization: Leveraging Summary and Source Text Relationships.

  3. Domain-Independent Abstract Generation for Focused Meeting Summarization.

  4. Generating and Validating Abstracts of Meeting Conversations: a User Study.

  5. Generating Abstractive Summaries from Meeting Transcripts.

Basics

  1. Ilya Sutskever, Oriol Vinyals, Quoc V. Le. “Sequence to Sequence Learning with Neural Networks”. NIPS (2014).
  2. Dzmitry Bahdanau, Kyunghyun Cho, Yoshua Bengio. “Neural Machine Translation by Jointly Learning to Align and Translate”. CoRR (2014).
  3. Thang Luong, Hieu Pham, Christopher D. Manning. “Effective Approaches to Attention-based Neural Machine Translation”. EMNLP (2015).
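The attention step these basics papers introduce (Bahdanau et al.; the dot-product variant of Luong et al.) can be sketched in a few lines: score each encoder hidden state against the current decoder state, normalize with a softmax, and take the weighted sum as the context vector. A stdlib-only illustration with toy vectors (function names are ours, not from any paper):

```python
import math

def softmax(scores):
    # numerically stable softmax
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def attend(decoder_state, encoder_states):
    """Luong-style dot-product attention (simplified sketch).
    Returns the context vector and the attention weights."""
    scores = [sum(d * h for d, h in zip(decoder_state, hs))
              for hs in encoder_states]
    weights = softmax(scores)
    # context = weighted sum of encoder states
    context = [sum(w * hs[i] for w, hs in zip(weights, encoder_states))
               for i in range(len(decoder_state))]
    return context, weights

# Toy example: three 2-d encoder states, one decoder state.
enc = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
dec = [1.0, 0.0]
context, weights = attend(dec, enc)
```

The weights sum to one, and encoder states more aligned with the decoder state receive more mass; real models add learned projections and run this inside an RNN decoder at every step.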

Other References

Compression

Headline

Extractive