Marchisio, A., Caramia, G., Martina, M., & Shafique, M. (2022). fakeWeather: Adversarial Attacks for Deep Neural Networks Emulating Weather Conditions on the Camera Lens of Autonomous Systems. In Proceedings of the 2022 International Joint Conference on Neural Networks (IJCNN) (pp. 1–9). https://doi.org/10.1109/IJCNN55064.2022.9892612
E191-02 - Research Unit Embedded Computing Systems
-
Published in:
Proceedings 2022 International Joint Conference on Neural Networks (IJCNN)
-
ISBN:
978-1-7281-8671-9
-
Volume:
2022-July
-
Date (published):
2022
-
Event name:
2022 International Joint Conference on Neural Networks (IJCNN)
Event date:
18-Jul-2022 - 23-Jul-2022
-
Event place:
Padua, Italy
-
Number of Pages:
9
-
Keywords:
Adversarial Attacks; Deep Neural Networks; Hail; Rain; Snow; Weather
Abstract:
Recently, Deep Neural Networks (DNNs) have achieved remarkable performance in many applications, while several studies have highlighted their vulnerability to malicious attacks. In this paper, we emulate the effects of natural weather conditions to introduce plausible perturbations that mislead DNNs. By observing the effects of such atmospheric perturbations on the camera lens, we model the patterns to create different masks that fake the effects of rain, snow, and hail. Even though the perturbations introduced by our attacks are visible, their presence remains unnoticed due to their association with natural events, which can be especially catastrophic for fully autonomous and unmanned vehicles. We test our proposed fakeWeather attacks on multiple Convolutional Neural Network and Capsule Network models, and report noticeable accuracy drops in the presence of such adversarial perturbations. Our work introduces a new security threat for DNNs, which is especially severe for safety-critical applications and autonomous systems.
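Illustration:
The attack described in the abstract amounts to overlaying a weather-like mask on the input image before it reaches the classifier. The following is a minimal sketch of that idea in Python/NumPy; the names fake_rain_mask and apply_weather_attack, and all parameter values, are illustrative assumptions and do not reproduce the paper's actual mask models or code.

import numpy as np

def fake_rain_mask(height, width, num_streaks=40, streak_len=8, rng=None):
    """Build a binary mask of slanted streaks that loosely mimics raindrops
    on a camera lens. All parameters are illustrative, not from the paper."""
    rng = np.random.default_rng() if rng is None else rng
    mask = np.zeros((height, width), dtype=np.float32)
    for _ in range(num_streaks):
        r = rng.integers(0, height)
        c = rng.integers(0, width)
        for k in range(streak_len):
            rr, cc = r + k, c + k // 2  # slanted streak
            if rr < height and cc < width:
                mask[rr, cc] = 1.0
    return mask

def apply_weather_attack(image, mask, intensity=1.0):
    """Overlay the weather mask on a [0, 1] image: masked pixels are pushed
    toward white, emulating bright rain/snow/hail traces on the lens."""
    perturbed = image * (1.0 - mask) + intensity * mask
    return np.clip(perturbed, 0.0, 1.0)

# Example: perturb a dummy 32x32 grayscale image and count modified pixels.
img = np.full((32, 32), 0.5, dtype=np.float32)
adv = apply_weather_attack(img, fake_rain_mask(32, 32))
print("changed pixels:", int((adv != img).sum()))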
Research Areas:
Computer Engineering and Software-Intensive Systems: 100%