Journal Article

Human Interaction Anticipation by Combining Deep Features and Transformed Optical Flow Components
Document Type
Periodical
Source
IEEE Access. 8:137646-137657, 2020
Subject
Aerospace
Bioengineering
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Engineered Materials, Dielectrics and Plasmas
Engineering Profession
Fields, Waves and Electromagnetics
General Topics for Engineers
Geoscience
Nuclear Engineering
Photonics and Electrooptics
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Transportation
Feature extraction
Cameras
Optical flow
Histograms
Skeleton
Video surveillance
Lighting
Human interaction anticipation
video surveillance
deep learning
transformed optical flow
Language
English
ISSN
2169-3536
Abstract
The anticipation of ongoing human interactions is not only a highly dynamic and challenging problem but also an extremely important one in applications such as remote monitoring, video surveillance, human-robot interaction, and anti-terrorism and anti-crime security. In this work, we address the problem of anticipating interactions between people monitored by single as well as multiple camera views. To this end, we propose a novel approach that integrates Deep Features with novel hand-crafted features, namely Transformed Optical Flow Components (TOFCs). To validate its performance, we tested the proposed approach in real outdoor environments captured with single as well as multiple cameras, featuring shadow and illumination variations and cluttered backgrounds. The results of the proposed approach are also compared with state-of-the-art approaches. The experimental results show that the proposed approach is promising for anticipating real human interactions.
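
The abstract does not specify how the Transformed Optical Flow Components are computed or which network supplies the deep features, so the sketch below is only an illustration of the general idea of fusing appearance (CNN) and motion (optical flow) cues for a video clip. The ResNet-18 backbone, the magnitude/orientation histograms standing in for TOFCs, and all function names are assumptions for illustration, not the authors' method.

```python
# Minimal sketch (assumptions noted above): fuse deep CNN features with a
# hand-crafted optical-flow descriptor for a short clip of frames.
import cv2
import numpy as np
import torch
from torchvision import models, transforms

# Generic pretrained backbone used as a deep-feature extractor
# (assumption: the paper's actual backbone may differ).
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()   # expose the 512-d pooled features
backbone.eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Resize((224, 224)),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def deep_feature(frame_bgr):
    """512-d CNN descriptor for a single frame (BGR input, as from OpenCV)."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    with torch.no_grad():
        return backbone(preprocess(rgb).unsqueeze(0)).squeeze(0).numpy()

def flow_feature(prev_bgr, curr_bgr, bins=16):
    """Hand-crafted motion descriptor (placeholder for TOFCs): histograms of
    dense optical-flow magnitude and orientation between two frames."""
    prev = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    curr = cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    h_mag, _ = np.histogram(mag, bins=bins, density=True)
    h_ang, _ = np.histogram(ang, bins=bins, range=(0, 2 * np.pi), density=True)
    return np.concatenate([h_mag, h_ang])

def clip_descriptor(frames):
    """Combined appearance + motion descriptor for a clip, e.g. as input to a
    classifier that anticipates the ongoing interaction class."""
    deep = np.mean([deep_feature(f) for f in frames], axis=0)
    flows = [flow_feature(a, b) for a, b in zip(frames[:-1], frames[1:])]
    return np.concatenate([deep, np.mean(flows, axis=0)])
```

In this kind of two-stream fusion, the concatenated descriptor would typically be fed to a standard classifier (e.g. an SVM or a small fully connected network) trained on partially observed interaction videos; the record gives no details of the classifier actually used.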