Conference Proceedings

Adversarial Camouflage: Hiding Physical-World Attacks with Natural Styles

Ranjie Duan, Xingjun Ma, Yisen Wang, James Bailey, AK Qin, Yun Yang

2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) | IEEE | Published: 2020

Abstract

Deep neural networks (DNNs) are known to be vulnerable to adversarial examples. Existing works have mostly focused on either digital adversarial examples created via small and imperceptible perturbations, or physical-world adversarial examples created with large and less realistic distortions that are easily identified by human observers. In this paper, we propose a novel approach, called Adversarial Camouflage (AdvCam), to craft and camouflage physical-world adversarial examples into natural styles that appear legitimate to human observers. Specifically, AdvCam transfers large adversarial perturbations into customized styles, which are then "hidden" on-target object or off-target background. ...
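The abstract describes combining an adversarial objective with a style-transfer objective so that large perturbations look like a natural, customized style. Below is a minimal sketch of that high-level idea, not the authors' released implementation: the choice of VGG-16 for style statistics, the target classifier, the layer indices, and the loss weight lam are illustrative assumptions, and input normalization is omitted for brevity.

```python
# Sketch: optimise an image so it is misclassified (adversarial loss) while its
# appearance is pulled toward a chosen style image (style/content losses).
import torch
import torch.nn.functional as F
from torchvision import models

device = "cuda" if torch.cuda.is_available() else "cpu"

# Frozen feature extractor for style statistics and a frozen classifier to attack.
vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features.to(device).eval()
classifier = models.resnet50(weights=models.ResNet50_Weights.DEFAULT).to(device).eval()
for p in list(vgg.parameters()) + list(classifier.parameters()):
    p.requires_grad_(False)

STYLE_LAYERS = {3, 8, 15, 22}  # assumed VGG-16 feature indices, not from the paper

def features(x):
    """Collect feature maps at the chosen VGG layers."""
    feats, h = {}, x
    for i, layer in enumerate(vgg):
        h = layer(h)
        if i in STYLE_LAYERS:
            feats[i] = h
    return feats

def gram(f):
    """Gram matrix of a feature map, used as the style statistic."""
    b, c, hgt, wid = f.shape
    f = f.view(b, c, hgt * wid)
    return f @ f.transpose(1, 2) / (c * hgt * wid)

def advcam_like_loss(x_adv, x_content, x_style, target_class, lam=1e4):
    """Targeted adversarial loss plus style and content losses (illustrative weights)."""
    adv_loss = F.cross_entropy(classifier(x_adv), target_class)  # pull toward target label
    fa, fc, fs = features(x_adv), features(x_content), features(x_style)
    style_loss = sum(F.mse_loss(gram(fa[i]), gram(fs[i])) for i in STYLE_LAYERS)
    content_loss = sum(F.mse_loss(fa[i], fc[i]) for i in STYLE_LAYERS)
    return adv_loss + lam * style_loss + content_loss

# Usage sketch: start from the content image and descend on the combined loss.
x_content = torch.rand(1, 3, 224, 224, device=device)  # placeholder inputs
x_style = torch.rand(1, 3, 224, 224, device=device)
target = torch.tensor([207], device=device)             # arbitrary target label
x_adv = x_content.clone().requires_grad_(True)
opt = torch.optim.Adam([x_adv], lr=0.01)
for _ in range(5):                                       # a few illustrative steps
    opt.zero_grad()
    advcam_like_loss(x_adv, x_content, x_style, target).backward()
    opt.step()
    x_adv.data.clamp_(0, 1)                              # keep a valid image
```

The sketch omits the physical-world adaptation steps (e.g. sampling viewpoint, lighting, and printing transformations during optimisation) that the full method would also need.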


Grants

Awarded by Australian Research Council Discovery Project


Funding Acknowledgements

Yun Yang is supported by Australian Research Council Discovery Project under Grant DP180100212. We are also grateful for early-stage discussions with Dr Sheng Wen from Swinburne University of Technology.