Academic paper

Enhancing Next Active Object-Based Egocentric Action Anticipation with Guided Attention
Document Type
Conference
Source
2023 IEEE International Conference on Image Processing (ICIP), pp. 1450-1454, Oct. 2023
Subject
Computing and Processing
Signal Processing and Analysis
Annotations
Human-robot interaction
Feature extraction
Transformers
Spatiotemporal phenomena
Decoding
Data mining
egocentric
action anticipation
transformers
short-term anticipation
next active object
Language
English
Abstract
Short-term action anticipation (STA) in first-person videos is a challenging task that involves understanding upcoming interactions with the next active objects and predicting future actions. Existing action anticipation methods have primarily focused on features extracted from video clips but often overlook the importance of objects and their interactions. To this end, we propose a novel approach that applies a guided attention mechanism between object features and the spatiotemporal features extracted from video clips, enhancing motion and contextual information, and then decodes the object-centric and motion-centric information to address STA in egocentric videos. Our method, GANO (Guided Attention for Next active Objects), is a multi-modal, end-to-end, single-transformer-based network. Experimental results on the largest egocentric dataset demonstrate that GANO outperforms existing state-of-the-art methods in predicting the next active object's label, its bounding-box location, the corresponding future action, and the time to contact the object. An ablation study shows the positive contribution of the guided attention mechanism compared to other fusion methods. Moreover, the next active object location and class-label predictions of GANO can be further improved by simply appending the learnable object tokens to the region-of-interest embeddings. Related implementations are available at: sanketsans.github.io/guided-attention-egocentric.html
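The abstract describes a guided attention mechanism that fuses object embeddings with spatiotemporal clip features. As a rough, hypothetical illustration of that idea (not the paper's actual GANO implementation, whose architecture and dimensions are not given here), a minimal scaled dot-product cross-attention in which clip features query detected-object embeddings could be sketched in pure Python as:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def guided_attention(video_feats, object_feats):
    """Cross-attention sketch: each spatiotemporal clip feature (query)
    attends over detected-object embeddings (keys/values), producing a
    fused, object-guided representation per time step.

    video_feats:  list of T feature vectors from the video clip
    object_feats: list of N object-embedding vectors (same dimension)
    Returns a list of T fused vectors.
    """
    d = len(object_feats[0])
    scale = 1.0 / math.sqrt(d)  # standard scaled dot-product attention
    fused = []
    for q in video_feats:
        # similarity of this clip feature to every object embedding
        scores = [scale * sum(qi * ki for qi, ki in zip(q, k))
                  for k in object_feats]
        weights = softmax(scores)
        # weighted sum of object embeddings (values)
        out = [sum(w * v[j] for w, v in zip(weights, object_feats))
               for j in range(d)]
        fused.append(out)
    return fused

# Tiny demo: one clip feature attending over two object embeddings.
fused = guided_attention([[1.0, 0.0]], [[1.0, 0.0], [0.0, 1.0]])
```

In this sketch the fused output is a convex combination of the object embeddings, so object context is injected into every time step; the real GANO model additionally decodes object-centric and motion-centric streams with a transformer, which is beyond this toy example.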