Abstractive and Extractive Deep Learning Methods for Text Summarization
Alexander M. Rush, Sumit Chopra, Jason Weston. “A Neural Attention Model for Abstractive Sentence Summarization”. EMNLP (2015).
Ramesh Nallapati, Bing Xiang, Bowen Zhou. “Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond”. CoNLL (2016).
Sumit Chopra, Alexander M. Rush, Michael Auli. “Abstractive Sentence Summarization with Attentive Recurrent Neural Networks”. NAACL (2016).
Jianpeng Cheng, Mirella Lapata. “Neural Summarization by Extracting Sentences and Words”. ACL (2016).
Kristina Toutanova, Chris Brockett, Ke M. Tran, Saleema Amershi. “A Dataset and Evaluation Metrics for Abstractive Compression of Sentences and Short Paragraphs”. EMNLP (2016).
Abigail See, Peter J. Liu, Christopher D. Manning. “Get To The Point: Summarization with Pointer-Generator Networks”. ACL (2017).
Qingyu Zhou, Nan Yang, Furu Wei, Ming Zhou. “Selective Encoding for Abstractive Sentence Summarization”. ACL (2017).
Ramesh Nallapati, Feifei Zhai, Bowen Zhou. “SummaRuNNer: A Recurrent Neural Network Based Sequence Model for Extractive Summarization of Documents”. AAAI (2017).
Jiatao Gu, Zhengdong Lu, Hang Li, Victor O.K. Li. “Incorporating Copying Mechanism in Sequence-to-Sequence Learning”. ACL (2016).
Piji Li, Wai Lam, Lidong Bing, Zihao Wang. “Deep Recurrent Generative Decoder for Abstractive Text Summarization”. EMNLP (2017).
Jin-ge Yao, Xiaojun Wan, Jianguo Xiao. “Recent Advances in Document Summarization”. KAIS (2017), survey paper.
Shibhansh Dohare, Harish Karnick. “Text Summarization using Abstract Meaning Representation”. arXiv (2017).
Jeffrey Ling, Alexander M. Rush. “Coarse-to-Fine Attention Models for Document Summarization”. NFiS@EMNLP (2017).
Romain Paulus, Caiming Xiong, Richard Socher. “A Deep Reinforced Model for Abstractive Summarization”. ICLR (2018).
Asli Celikyilmaz, Antoine Bosselut, Xiaodong He, Yejin Choi. “Deep Communicating Agents for Abstractive Summarization”. NAACL (2018).
Arman Cohan, Franck Dernoncourt, Doo Soon Kim, Trung Bui, Seokhwan Kim, Walter Chang, Nazli Goharian. “A Discourse-Aware Attention Model for Abstractive Summarization of Long Documents”. NAACL (2018).
Yashar Mehdad, Giuseppe Carenini, Frank Tompa, Raymond T. Ng. “Abstractive Meeting Summarization with Entailment and Fusion”. ENLG (2013).
Lu Wang, Claire Cardie. “Domain-Independent Abstract Generation for Focused Meeting Summarization”. ACL (2013).
Gabriel Murray, Giuseppe Carenini, Raymond Ng. “Generating and Validating Abstracts of Meeting Conversations: A User Study”. INLG (2010).
Wenyuan Zeng, Wenjie Luo, Sanja Fidler, Raquel Urtasun. “Efficient Summarization with Read-Again and Copy Mechanism”. arXiv (2016).
Jun Suzuki, Masaaki Nagata. “Cutting-off Redundant Repeating Generations for Neural Abstractive Summarization”. EACL (2017).
Linqing Liu, Yao Lu, Min Yang, Qiang Qu, Jia Zhu, Hongyan Li. “Generative Adversarial Network for Abstractive Text Summarization”. To appear.
Xinyu Hua, Lu Wang. “A Pilot Study of Domain Adaptation Effect for Neural Abstractive Summarization”. NFiS@EMNLP (2017).
Angela Fan, David Grangier, Michael Auli. “Controllable Abstractive Summarization”. arXiv (2017).
Ziqiang Cao, Furu Wei, Wenjie Li, Sujian Li. “Faithful to the Original: Fact Aware Neural Abstractive Summarization”. AAAI (2018).
Piji Li, Lidong Bing, Wai Lam. “Reader-Aware Multi-Document Summarization: An Enhanced Model and The First Dataset”. NFiS@EMNLP (2017).
Shuming Ma, Xu Sun, Jingjing Xu, Houfeng Wang, Wenjie Li, Qi Su. “Improving Semantic Relevance for Sequence-to-Sequence Learning of Chinese Social Media Text Summarization”. ACL (2017).
Long Zhou, Jiajun Zhang, Chengqing Zong. “Look-ahead Attention for Generation in Neural Machine Translation”. NLPCC (2017).
Minsoo Kim, Moirangthem Dennis Singh, Minho Lee. “Towards Abstraction from Extraction: Multiple Timescale Gated Recurrent Unit for Summarization”. Rep4NLP@ACL (2016).
Fei Liu, Jeffrey Flanigan, Sam Thomson, Norman Sadeh, Noah A. Smith. “Toward Abstractive Summarization Using Semantic Representations”. NAACL (2015).