Empirical Evaluation of Physical Adversarial Patch Attacks Against Overhead Object Detection Models

Andrew Lohn

June 25, 2022

Adversarial patches are images designed to fool otherwise well-performing neural network-based computer vision models. Although these attacks were initially conceived of and studied as digital attacks, in which the raw pixel values of an image are perturbed, recent work has demonstrated that they can transfer to the physical world: the patch is printed out and placed in a scene before new images or video footage are captured.
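To illustrate the digital form of the attack described above, the following sketch shows how a patch might be pasted into an image and its pixels optimized by gradient descent to suppress a detector's confidence. This is a minimal, hypothetical example, not the report's method; the `detector` callable, the patch placement, and the loss are placeholder assumptions (here the detector is assumed to return a tensor of objectness scores).

```python
# Hypothetical sketch of a digital adversarial patch attack (not the report's code).
import torch

def apply_patch(image, patch, top, left):
    """Paste a patch tensor into a copy of an image at position (top, left)."""
    patched = image.clone()
    h, w = patch.shape[-2:]
    patched[..., top:top + h, left:left + w] = patch
    return patched

def optimize_patch(detector, images, patch_size=(3, 64, 64), steps=200, lr=0.01):
    """Gradient-descend on raw patch pixel values so the detector's scores drop."""
    patch = torch.rand(patch_size, requires_grad=True)    # start from random pixels
    opt = torch.optim.Adam([patch], lr=lr)
    for _ in range(steps):
        loss = 0.0
        for img in images:
            patched = apply_patch(img, patch.clamp(0, 1), top=16, left=16)
            scores = detector(patched.unsqueeze(0))        # assumed: returns objectness scores
            loss = loss + scores.max()                     # suppress the strongest detection
        opt.zero_grad()
        loss.backward()
        opt.step()
    return patch.detach().clamp(0, 1)
```

The physical variant the report evaluates would then print the optimized patch and place it in the scene itself, rather than editing pixels after capture.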
