Journal Article

Graph Neural Networks for Cross-Camera Data Association
Document Type
Periodical
Source
IEEE Transactions on Circuits and Systems for Video Technology, 33(2):589-601, Feb. 2023
Subject
Components, Circuits, Devices and Systems
Communication, Networking and Broadcast Technologies
Computing and Processing
Signal Processing and Analysis
Cameras
Task analysis
Image edge detection
Three-dimensional displays
Message passing
Graph neural networks
Feature extraction
Data association
cross-camera
graph neural network
message passing network
Language
English
ISSN
1051-8215
1558-2205
Abstract
Cross-camera image data association is essential for many multi-camera computer vision tasks, such as multi-camera pedestrian detection, multi-camera multi-target tracking, and 3D pose estimation. This association task is typically modeled as a bipartite graph matching problem and often solved by applying minimum-cost flow techniques, which may be computationally demanding for large amounts of data. Furthermore, cameras are usually treated in pairs, yielding local solutions, rather than obtaining a single global solution for all cameras at once. Another key issue is that of the affinity function: the widespread usage of non-learnable pre-defined distances, such as the Euclidean and cosine distances. This paper proposes an effective approach for cross-camera data association focused on a global solution, instead of processing cameras in pairs. To avoid the usage of fixed distances and thresholds, we leverage the connectivity of Graph Neural Networks, previously unused in this scope, using a Message Passing Network to jointly learn features and similarity functions. We validate the proposal for pedestrian cross-camera association, showing results over the EPFL multi-camera pedestrian dataset. Our approach considerably outperforms existing data association techniques from the literature, without requiring training in the same scenario in which it is tested. Our code is available at https://www-vpu.eps.uam.es/publications/gnn
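The following is a minimal illustrative sketch of the idea summarized in the abstract, not the authors' released code: it assumes detections are given as appearance embeddings (nodes) and candidate cross-camera pairs as edges, and a small message passing network updates node and edge features before scoring each edge as a same-identity association. All names and dimensions are hypothetical.

# Illustrative sketch (PyTorch, hypothetical): edge-classifying message passing
# for cross-camera association. Nodes = detection embeddings, edges = candidate
# cross-camera pairs; the network outputs an association probability per edge.
import torch
import torch.nn as nn

class EdgeMPN(nn.Module):
    def __init__(self, node_dim=128, edge_dim=32, steps=3):
        super().__init__()
        self.edge_init = nn.Sequential(nn.Linear(2 * node_dim, edge_dim), nn.ReLU())
        self.edge_update = nn.Sequential(nn.Linear(2 * node_dim + edge_dim, edge_dim), nn.ReLU())
        self.node_update = nn.Sequential(nn.Linear(node_dim + edge_dim, node_dim), nn.ReLU())
        self.classifier = nn.Linear(edge_dim, 1)  # edge score: same identity or not
        self.steps = steps

    def forward(self, x, edge_index):
        # x: (N, node_dim) detection embeddings; edge_index: (2, E) candidate pairs
        src, dst = edge_index
        e = self.edge_init(torch.cat([x[src], x[dst]], dim=-1))
        for _ in range(self.steps):
            # update edge features from their endpoint node features
            e = self.edge_update(torch.cat([x[src], x[dst], e], dim=-1))
            # aggregate incident edge messages per node (mean over incoming edges)
            agg = torch.zeros(x.size(0), e.size(1), device=x.device)
            agg.index_add_(0, dst, e)
            deg = torch.zeros(x.size(0), device=x.device).index_add_(
                0, dst, torch.ones_like(dst, dtype=torch.float)).clamp(min=1).unsqueeze(-1)
            x = self.node_update(torch.cat([x, agg / deg], dim=-1))
        return torch.sigmoid(self.classifier(e)).squeeze(-1)  # association probability per edge

# Toy usage: 3 detections, 2 candidate cross-camera edges (0-2 and 1-2).
model = EdgeMPN()
feats = torch.randn(3, 128)
edges = torch.tensor([[0, 1], [2, 2]])
print(model(feats, edges))

In this sketch the learned edge scores replace a fixed Euclidean or cosine affinity; in practice they would feed a global partitioning step over the whole multi-camera graph rather than per-pair matching.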