Academic Article

Performance and Analysis of Optical Flow Techniques for Moving Object Segmentation & Detection in Infrared Tracking Method
Document Type
Conference
Source
2023 4th International Conference on Smart Electronics and Communication (ICOSEC), pp. 1213-1220, Sep. 2023
Subject
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Fields, Waves and Electromagnetics
Signal Processing and Analysis
Resistance
Optical microscopy
Target tracking
Video sequences
Object segmentation
Adaptive optics
Motion compensation
optical flow at several scales and identification of microscopic targets using infrared video
Language
English
Abstract
For the detection of small targets in video infrared (IR) imagery, the spatio-temporal information between video frames is crucial. Optical flow is frequently used for motion estimation and motion compensation so that this additional temporal information can be exploited efficiently when identifying small targets in IR sequences. Standard optical flow-based detection techniques, however, can only handle very slight motion between frames. Because of the long viewing distance and the low IR imaging frame rate, the target's spatial location often varies considerably between two frames, which reduces the effectiveness of optical flow-based detection. We propose an end-to-end video infrared small target detection framework that is more robust to large motion and capable of more precise motion compensation. To estimate motion more accurately, we first employ a multi-scale optical flow reconstruction network. The neighbouring frames are then aligned to the reference frame using the estimated optical flows, and the concatenated aligned frames are fed to the detection network to obtain the detection results.
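The abstract outlines a flow-guided compensation step: neighbouring IR frames are warped to the reference frame using the estimated optical flow and then concatenated before detection. Below is a minimal PyTorch sketch of that alignment and concatenation step only; it is not the authors' code, and the function name warp_to_reference, the tensor shapes, and the zero-flow placeholders are illustrative assumptions (the multi-scale flow network and the detector themselves are omitted).

```python
# Sketch of flow-guided frame alignment for IR small target detection.
# Assumes flows come from some optical flow estimator (not shown here).
import torch
import torch.nn.functional as F

def warp_to_reference(frame: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
    """Warp `frame` (N, C, H, W) towards the reference frame using `flow` (N, 2, H, W),
    where flow[:, 0] is the horizontal and flow[:, 1] the vertical displacement in pixels."""
    n, _, h, w = frame.shape
    # Base sampling grid in pixel coordinates.
    ys, xs = torch.meshgrid(torch.arange(h), torch.arange(w), indexing="ij")
    grid = torch.stack((xs, ys), dim=0).float().unsqueeze(0).to(frame)  # (1, 2, H, W)
    coords = grid + flow                                                # displaced coordinates
    # Normalise to [-1, 1] as required by grid_sample.
    coords_x = 2.0 * coords[:, 0] / max(w - 1, 1) - 1.0
    coords_y = 2.0 * coords[:, 1] / max(h - 1, 1) - 1.0
    sampling_grid = torch.stack((coords_x, coords_y), dim=-1)           # (N, H, W, 2)
    return F.grid_sample(frame, sampling_grid, mode="bilinear",
                         padding_mode="border", align_corners=True)

# Usage: align two neighbouring frames to the reference frame and concatenate
# them along the channel dimension as input to a detection network.
ref = torch.rand(1, 1, 128, 128)                          # reference IR frame
neigh = [torch.rand(1, 1, 128, 128) for _ in range(2)]    # neighbouring frames
flows = [torch.zeros(1, 2, 128, 128) for _ in range(2)]   # placeholder estimated flows
aligned = [warp_to_reference(f, fl) for f, fl in zip(neigh, flows)]
detector_input = torch.cat([ref, *aligned], dim=1)        # (1, 3, 128, 128)
```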