Deep Communicating Agents for Abstractive Summarization

Abstract

We present deep communicating agents in an encoder-decoder architecture to address the challenges of representing a long document for abstractive summarization. With deep communicating agents, the task of encoding a long text is divided across multiple collaborating agents, each in charge of a subsection of the input text. These encoders are connected to a single decoder, trained end-to-end using reinforcement learning to generate a focused and coherent summary. Empirical results demonstrate that multiple communicating encoders lead to higher-quality summaries than a single encoder over the entire input.
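The division of labor described above can be illustrated with a toy sketch. This is not the authors' implementation (which uses recurrent encoders, attention, and reinforcement learning); it is a minimal stand-in in which each agent "encodes" its contiguous chunk as a bag of words, agents exchange pooled states as a crude analogue of inter-agent message passing, and a single "decoder" consumes all agent states. The function names (`split_into_agents`, `communicate`, etc.) are illustrative, not from the paper.

```python
from collections import Counter

def split_into_agents(tokens, n_agents):
    """Divide a token sequence into contiguous chunks, one per agent."""
    size = -(-len(tokens) // n_agents)  # ceiling division
    return [tokens[i:i + size] for i in range(0, len(tokens), size)]

def encode(chunk):
    # Stand-in for a learned encoder: a bag-of-words state.
    return Counter(chunk)

def communicate(states):
    # Toy analogue of message passing: each agent augments its own
    # state with the pooled states of all agents.
    pooled = sum(states, Counter())
    return [state + pooled for state in states]

def decode(states, length):
    # Single decoder over all agent states; here it simply emits the
    # globally most frequent tokens as a stand-in "summary".
    merged = sum(states, Counter())
    return [word for word, _ in merged.most_common(length)]

doc = "the cat sat on the mat the end".split()
chunks = split_into_agents(doc, n_agents=2)
states = communicate([encode(c) for c in chunks])
summary = decode(states, length=3)
```

The key structural point the sketch preserves is that no single component ever sees the full document: each agent encodes only its own chunk, and global information reaches the decoder only through the agents' exchanged states.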

Publication
Proceedings of the 16th Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL)