It is possible to fool face-recognition (FR) systems into misidentifying one person, A, as some specified other person, B, by projecting a pattern of infrared light onto A's face while the recognizer's camera photographs it, creating a customized adversarial example. Because near-infrared light can be detected by surveillance cameras but not by human eyes, bystanders cannot detect the masquerade, even at close range. To project the light patterns, the researchers had person A wear a baseball cap with tiny infrared LEDs tucked up under the bill.
“Invisible Mask: Practical Attacks on Face Recognition with Infrared”
Zhe Zhou, Di Tang, Xiaofeng Wang, Weili Han, Xiangyu Liu, and Kehuan Zhang, arXiv, March 13, 2018
In this paper, we present the first approach that makes it possible to apply [an] automatically-identified, unique adversarial example to [a] human face in an inconspicuous way [that is] completely invisible to human eyes. As a result, the adversary masquerading as someone else will be able to walk on the street, without any noticeable anomaly to other individuals[,] but appearing to be a completely different person to the FR system behind surveillance cameras.
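The core optimization behind such an impersonation attack can be sketched as a targeted adversarial perturbation: push A's face toward B's identity in the recognizer's embedding space, while confining the change to the small region the LEDs can illuminate and bounding how much brightness they can add. The toy below is a sketch under heavy assumptions, not the paper's actual method: the "embedding" is a random linear projection standing in for a deep FR network, and the mask size, step size, and brightness budget are all made up for illustration.

```python
import numpy as np

# Hypothetical stand-in for a face-recognition pipeline: the "embedding"
# is a fixed random linear projection of a flattened face image. Real FR
# systems use deep CNNs; the attack logic has the same shape either way.
rng = np.random.default_rng(0)
DIM, EMB = 64, 8
W = rng.normal(size=(EMB, DIM)) / np.sqrt(DIM)

def embed(x):
    return W @ x

face_a = rng.uniform(0, 1, DIM)   # attacker's face (person A)
face_b = rng.uniform(0, 1, DIM)   # impersonation target (person B)
target = embed(face_b)

# Mask restricting the perturbation to a small region, analogous to the
# patch of skin the cap-mounted IR LEDs can actually illuminate.
mask = np.zeros(DIM)
mask[:16] = 1.0

# Targeted attack: projected gradient descent on
# ||embed(x_adv) - embed(face_b)||^2, with the perturbation confined
# to the mask and clipped to an assumed per-pixel brightness budget.
x_adv = face_a.copy()
budget = 3.0
for _ in range(500):
    grad = 2 * W.T @ (embed(x_adv) - target)   # analytic gradient of the loss
    x_adv = x_adv - 0.1 * grad * mask          # step only inside the mask
    delta = np.clip(x_adv - face_a, -budget, budget) * mask
    x_adv = face_a + delta                     # project back into the budget

before = np.linalg.norm(embed(face_a) - target)
after = np.linalg.norm(embed(x_adv) - target)
print(f"distance to B's embedding: before={before:.3f}, after={after:.3f}")
```

Even with the perturbation confined to a quarter of the "pixels", the adversarial face lands far closer to B in embedding space than A's unmodified face does, which is why a recognizer keyed on embedding distance misidentifies A as B.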