Physical Adversarial Attacks of Diffractive Deep Neural Networks
Time: Wednesday, December 8th, 6:00pm - 7:00pm PST
Location: Level 2 - Lobby
Event Type
Late Breaking Results Poster
Networking Reception
Virtual Programs
Presented In-Person
Description: Diffractive Deep Neural Networks (D2NNs) perform neural-network inference through the diffraction of light and have demonstrated significant energy-efficiency gains across a broad range of security-sensitive applications. In this work, we develop the first adversarial attack formulations with direct optical physical meaning, i.e., the phase attack, the amplitude attack, and the complex-domain attack, each of which can be realized in a D2NN system using amplitude and phase modulators. We demonstrate that the proposed Complex Fast Gradient Sign Method can generate minimally changed, physically feasible adversarial examples targeting a pre-trained D2NN model on the MNIST-10 dataset, bringing its accuracy down from 95.4% to below 20%.
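To make the three attack formulations concrete, the sketch below shows one FGSM-style step applied to the amplitude and/or phase of a complex optical field. This is a minimal illustration, not the authors' implementation: the function name `complex_fgsm`, the gradient interface (a pair of gradients with respect to amplitude and phase), and the step size `eps` are all assumptions made for the example.

```python
import numpy as np

def complex_fgsm(x, grad, eps, mode="complex"):
    """One illustrative FGSM-style step on a complex optical field.

    x    : complex array, the field A * exp(i * phi)
    grad : assumed interface -- a tuple (dL_dA, dL_dphi) of loss
           gradients w.r.t. amplitude and phase (hypothetical, not
           the paper's exact formulation)
    eps  : attack step size
    mode : "phase", "amplitude", or "complex" (both), mirroring the
           three attack formulations realizable with phase and
           amplitude modulators
    """
    A, phi = np.abs(x), np.angle(x)
    dA, dphi = grad
    if mode in ("amplitude", "complex"):
        # Amplitude attack: perturb only the modulus of the field.
        A = A + eps * np.sign(dA)
    if mode in ("phase", "complex"):
        # Phase attack: perturb only the phase of the field.
        phi = phi + eps * np.sign(dphi)
    # Recombine into the perturbed complex field.
    return A * np.exp(1j * phi)
```

A phase-only attack leaves the field's amplitude untouched (and vice versa), which is what makes each variant implementable with a single physical modulator.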