Academic Paper

Block-term Dropout For Robust Adversarial Defense
Document Type
Conference
Source
2022 IEEE 34th International Conference on Tools with Artificial Intelligence (ICTAI), pp. 622-629, Oct. 2022
Subject
Bioengineering
Computing and Processing
Robotics and Control Systems
Tensors
Neural networks
Stochastic processes
Benchmark testing
Network architecture
Robustness
Task analysis
Adversarial Machine Learning
Tensor Decomposition
Dropout
Deep Neural Networks
Language
English
ISSN
2375-0197
Abstract
Deep neural networks (DNNs) have lately shown tremendous performance in various applications. However, alongside their superiority in these tasks, recent studies have demonstrated that DNNs are easily fooled by adversarial attacks. To guard against adversarial examples, we provide a new solution for hardening DNNs through Block-term Dropout (BT-Dropout), an adversarial defense technique that leverages a latent high-order factorization of the network. Specifically, we impose a low-rank block-term tensor structure on the weights of fully-connected layers to obtain compact networks, and then apply BT-Dropout in the latent subspace without pruning the weights directly. Meanwhile, for the activation tensors fed into fully-connected layers, Tucker Dropout, which can be viewed as a special case of BT-Dropout, is introduced to preserve the multilinear structure of the activations. Furthermore, we show that BT-Dropout implicitly regularizes the tensor decomposition. Comprehensive experiments have demonstrated the effectiveness of our proposed method in improving the adversarial robustness of models on standard image classification benchmarks.
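The abstract's core idea, dropout applied to whole latent block terms of a factorized weight rather than to individual weights, can be sketched as follows. This is a minimal NumPy illustration under assumed shapes and ranks (the paper's exact decomposition, ranks, and training procedure are not given here): a 64x64 fully-connected weight is viewed as a 4th-order tensor, represented as a sum of R Tucker-style block terms, and each block term is stochastically dropped during training with inverted-dropout rescaling.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes: view a 64x64 FC weight as an (8,8,8,8) tensor.
in_modes, out_modes = (8, 8), (8, 8)
R = 4          # assumed number of block terms
core_rank = 2  # assumed Tucker rank of each block term's core

def make_block():
    # One block term = a small core plus one factor matrix per mode.
    core = 0.1 * rng.standard_normal((core_rank,) * 4)
    factors = [0.1 * rng.standard_normal((m, core_rank))
               for m in in_modes + out_modes]
    return core, factors

blocks = [make_block() for _ in range(R)]

def block_to_tensor(core, factors):
    # Reconstruct a full 4th-order tensor from one Tucker term
    # via successive mode-n products core x_n factor_n.
    t = core
    for axis, f in enumerate(factors):
        t = np.moveaxis(np.tensordot(f, t, axes=(1, axis)), 0, axis)
    return t

def bt_dropout_weight(blocks, p=0.5, training=True):
    # BT-Dropout sketch: drop entire latent block terms with probability p
    # (no direct weight pruning), then rescale survivors by 1/(1-p).
    w = np.zeros(in_modes + out_modes)
    for core, factors in blocks:
        if training and rng.random() < p:
            continue  # this whole block term is dropped this pass
        w += block_to_tensor(core, factors)
    if training:
        w /= (1.0 - p)
    # Matricize back to the usual FC weight shape.
    return w.reshape(int(np.prod(in_modes)), int(np.prod(out_modes)))

x = rng.standard_normal(64)
y = x @ bt_dropout_weight(blocks, p=0.5, training=True)
print(y.shape)  # (64,)
```

Dropping a block term removes an entire latent subspace at once, which is what distinguishes this from element-wise dropout on the reconstructed weight matrix; at inference (`training=False`) the full sum of block terms is used.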