Academic Paper

Adversarial Barrel! An Evaluation of 3D Physical Adversarial Attacks
Document Type
Conference
Source
2022 IEEE Applied Imagery Pattern Recognition Workshop (AIPR), pp. 1-6, Oct. 2022
Subject
Computing and Processing
General Topics for Engineers
Signal Processing and Analysis
Training
Solid modeling
Computer vision
Three-dimensional displays
Computational modeling
Transportation
Object detection
adversarial machine learning
ML
attacks
baseline
computer vision
AI
security
Language
English
ISSN
2332-5615
Abstract
Computer vision models based on Deep Neural Networks (DNNs) are vulnerable to adversarial attacks. It has also been demonstrated that physical adversarial attacks can affect computer vision models through printed media or physical 3D objects. However, the efficacy of physical adversarial attacks is highly variable under real-world conditions. In this research, we leverage a synthetic validation environment to evaluate 2D and 3D physical adversarial attacks on state-of-the-art object detection models (Faster R-CNN, RetinaNet, YOLOv3, YOLOv4). Using the Unreal Engine, we create synthetic environments to evaluate the limitations of physical adversarial attacks. We evaluate 2D adversarial patches under varying lighting conditions and poses. We optimize the same adversarial attacks for 3D shapes including a pyramid, a cube, and a barrel (cylinder), and evaluate the robustness of the 3D physical attacks against the 2D attack baseline. We test our attacks against object-detection models trained on MSCOCO, VIRAT, VISDRONE, and synthetic datasets. By advancing physical adversarial attacks and validation methodology, we improve our ability to red-team computer vision models with the goal of defending and assuring AI systems used in fields such as transportation and security.
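The record does not include the paper's code. As a rough illustration of the kind of optimization the abstract describes, the following is a minimal sketch of 2D adversarial patch optimization under randomized pose and lighting, in the style of Expectation over Transformation (a standard technique behind physical attacks). Everything here is an assumption for illustration: detector_score (a differentiable detection score supplied by the user), the transform ranges, and the fixed patch placement are hypothetical stand-ins, not the authors' Unreal Engine pipeline.

# Illustrative sketch only: EOT-style 2D patch optimization.
# Assumes a user-supplied differentiable detector score; not the
# paper's actual implementation or rendering pipeline.
import torch
import torchvision.transforms.functional as TF

def random_transform(patch):
    """Apply a random rotation and brightness change to crudely
    simulate varying pose and lighting (a 2D stand-in for the
    paper's 3D synthetic renders)."""
    angle = float(torch.empty(1).uniform_(-30.0, 30.0))
    brightness = float(torch.empty(1).uniform_(0.7, 1.3))
    patch = TF.rotate(patch, angle)
    return torch.clamp(patch * brightness, 0.0, 1.0)

def paste_patch(image, patch, top, left):
    """Overlay the patch onto a (3, H, W) scene image at (top, left);
    gradients flow through the pasted region."""
    out = image.clone()
    _, h, w = patch.shape
    out[:, top:top + h, left:left + w] = patch
    return out

def optimize_patch(detector_score, scenes, steps=500, lr=0.01, size=64):
    """Gradient-descend a patch that minimizes the detector's score,
    averaged over random transforms (the EOT expectation).

    detector_score: hypothetical callable mapping a batched image
        tensor to a scalar differentiable score for the target class.
    scenes: list of background images as (3, H, W) tensors in [0, 1],
        assumed large enough for the fixed placement below.
    """
    patch = torch.rand(3, size, size, requires_grad=True)
    opt = torch.optim.Adam([patch], lr=lr)
    for step in range(steps):
        scene = scenes[step % len(scenes)]
        transformed = random_transform(patch.clamp(0.0, 1.0))
        attacked = paste_patch(scene, transformed, top=50, left=50)
        loss = detector_score(attacked.unsqueeze(0))  # suppress detection
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            patch.clamp_(0.0, 1.0)  # keep the patch a valid image
    return patch.detach()

In the paper's setting, the random image-space transforms above would instead come from Unreal Engine renders of the patch applied to 3D shapes (pyramid, cube, barrel) under varying lighting and camera poses.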