Academic Paper

Butterfly: Multiple Reference Frames Feature Propagation Mechanism for Neural Video Compression
Document Type
Conference
Source
2023 Data Compression Conference (DCC), pp. 198-207, Mar. 2023
Subject
Communication, Networking and Broadcast Technologies
Signal Processing and Analysis
Codecs
Fuses
Bit rate
Data compression
Video compression
Encoding
Low latency communication
Language
ISSN
2375-0359
Abstract
Using more reference frames can significantly improve compression efficiency in neural video compression. However, in low-latency scenarios, most existing neural video compression frameworks use only the previous frame as the reference, and the few frameworks that do reference multiple previous frames adopt only a simple multi-reference frame propagation mechanism. In this paper, we present a more reasonable multi-reference frame propagation mechanism for neural video compression, called the butterfly multi-reference frame propagation mechanism (Butterfly), which enables more effective feature fusion of multiple reference frames. With it, we can generate a more accurate temporal context conditional prior for the Contextual Coding Module. In addition, when the number of decoded frames does not yet meet the required number of reference frames, we duplicate the nearest reference frame to fill the requirement, which works better than duplicating the furthest one. Experimental results show that our method significantly outperforms the previous state of the art (SOTA), and our neural codec achieves a -7.6% bitrate saving on the HEVC Class D dataset compared with our base single-reference-frame model under the same compression configuration.
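The reference-padding rule mentioned in the abstract (duplicating the nearest decoded frame when fewer frames have been decoded than the model requires) can be sketched as below. This is a minimal illustration only; the function name and the list-based frame representation are assumptions, not taken from the paper:

```python
def pad_references(decoded_frames, num_refs):
    """Return exactly `num_refs` reference frames, ordered oldest to newest.

    When fewer than `num_refs` frames have been decoded so far (e.g. at the
    start of a GOP), the nearest (most recently decoded) frame is duplicated
    to fill the shortfall, as the abstract suggests is preferable to
    duplicating the furthest frame.
    """
    if not decoded_frames:
        raise ValueError("at least one decoded frame is required")
    refs = list(decoded_frames[-num_refs:])   # take up to num_refs newest frames
    nearest = refs[-1]                        # most recently decoded frame
    refs += [nearest] * (num_refs - len(refs))  # pad with copies of the nearest
    return refs
```

For example, with three required references but only one decoded frame, the single frame is used three times; with two decoded frames, the newest one appears twice.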