Academic paper

Attention Localness in Shared Encoder-Decoder Model For Text Summarization
Document Type
Conference
Source
ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 1-5, Jun. 2023
Subject
Bioengineering
Communication, Networking and Broadcast Technologies
Computing and Processing
Signal Processing and Analysis
Semantics
Redundancy
Pipelines
Signal processing
Linguistics
Decoding
Task analysis
summarization
attention network
transformer
deep learning
Language
English
ISSN
2379-190X
Abstract
Text summarization aims to generate a brief version of a given article while preserving its essential meaning. Most existing solutions rely on the standard attention-based encoder-decoder framework, in which every token in the source article, including redundant content, contributes to the decoder through the attention mechanism. Filtering out this redundant content is therefore an important issue in text summarization. In this study, we propose a localness attention network, designed with simplicity and feasibility in mind, which delimits different local regions of the source article as contributors at different decoding steps. To further strengthen the localness model, we share the semantic space of the encoder and decoder. Experimental results on two benchmark datasets demonstrate the effectiveness and applicability of the proposed method in comparison with several well-practiced works.
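The abstract does not spell out how the local regions are formed, but the general idea of localness attention can be illustrated with a minimal sketch: add a locality bias to the raw attention logits so that source tokens far from a predicted center position are down-weighted before the softmax. The Gaussian-shaped bias, the `center` and `window` parameters, and the function name below are all assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def localness_attention(scores, center, window):
    """Hypothetical sketch of localness-biased attention for one decoding step.

    scores: (src_len,) raw attention logits over source tokens
    center: assumed center position of the local region for this step
    window: assumed width of the local region
    Returns a (src_len,) attention distribution concentrated near `center`.
    """
    positions = np.arange(scores.shape[0])
    # Gaussian locality bias: logits of tokens far from `center` are reduced,
    # so distant (potentially redundant) tokens contribute less to the decoder.
    bias = -((positions - center) ** 2) / (2 * (window / 2.0) ** 2)
    biased = scores + bias
    # Numerically stable softmax over the biased logits.
    e = np.exp(biased - biased.max())
    return e / e.sum()
```

With uniform logits, the resulting distribution peaks at the chosen center, so each decoding step effectively reads from a different local region of the source article.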