Tuesday, January 21, 2020

"Boffins don bad 1980s fashion to avoid being detected by object-recognizing AI cameras"

From The Register (keeping the word 'boffins' alive):

Adversarial T-shirt disguise hoodwinks machine-learning algorithms
In a pleasing symmetry, boffins have used machine-learning algorithms to develop a T-shirt design that causes its wearer to evade detection by object-recognition cameras.

Brainiacs at Northeastern University, MIT, and IBM Research in the US teamed up to create the 1980s-esque fashion statement, according to a paper quietly emitted via arXiv in mid-October. Essentially, the aim is to create a T-shirt that fools AI software into not detecting and classifying the wearer as a person. This means the wearer could slip past visitor or intruder detection systems, and so on.

The T-shirt design is a classic adversarial example. That means the pattern on the shirt has been carefully designed to manipulate just the right parts of a detection system's neural network to make it misidentify the wearer. Previous adversarial experiments have typically involved flat or rigid 2D or 3D images or objects, like stickers or toy turtles.

Now, this team, at least, has shown it’s possible to trick computer-vision models with more flexible materials like T-shirts, too.

“We highlight that the proposed adversarial T-shirt is not just a T-shirt with printed adversarial patch for clothing fashion, it is a physical adversarial wearable designed for evading person detectors in a real world,” the paper said.

In this case, the adversarial T-shirt helped a person evade detection. The two convolutional neural networks tested, YOLOv2 and Faster R-CNN, have been trained to identify objects. Under normal circumstances, when either is given a photo containing people, it should be able to draw a bounding box around them, labeling them as “person”.

But you can trick the system and avoid being noticed at all by wearing the adversarial T-shirt. “Our method aims to make the person's bounding box vanish rather than misclassify the person as a wrong label,” an IBM spokesperson told The Register....
....MORE
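
For the curious, here is roughly what that baseline behavior looks like in code. A minimal sketch in PyTorch using torchvision's pretrained Faster R-CNN (one of the two detectors named above); the random tensor standing in for a photo, and the 0.5 confidence cutoff, are my own placeholders, not anything from the paper:

```python
# Minimal sketch of the baseline behavior: a pretrained detector,
# given a photo containing people, draws a "person" box around each.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn(pretrained=True).eval()

# Placeholder image (3 x H x W, values in [0, 1]); use a real photo in practice.
image = torch.rand(3, 480, 640)

with torch.no_grad():
    detections = model([image])[0]  # the model takes a list of image tensors

# COCO label 1 is "person"; keep only reasonably confident detections.
keep = (detections["labels"] == 1) & (detections["scores"] > 0.5)
person_boxes = detections["boxes"][keep]
print(person_boxes)  # one (x1, y1, x2, y2) row per detected person
```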
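
And the attack, in caricature: instead of fooling the detector into a wrong label, you optimize the printed pattern so the "person" confidence, and with it the bounding box, sinks below the reporting threshold. The following is a rough sketch of that disappearance objective only, not the authors' method (the paper additionally models how a worn shirt wrinkles and deforms, using thin plate splines, all omitted here); the patch size and torso coordinates are made-up placeholders:

```python
# Rough sketch of a "vanishing bounding box" attack: optimize a patch
# so the detector's person-class confidence is driven toward zero.
# NOT the paper's method -- the real adversarial T-shirt also models
# cloth deformation; this shows only the core loss.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn(pretrained=True).eval()
image = torch.rand(3, 480, 640)  # placeholder photo containing a person

patch = torch.rand(3, 120, 120, requires_grad=True)  # made-up patch size
optimizer = torch.optim.Adam([patch], lr=0.01)

for step in range(100):
    adv = image.clone()
    # Paste the patch where the shirt would be (placeholder coordinates).
    adv[:, 180:300, 260:380] = patch.clamp(0.0, 1.0)

    detections = model([adv])[0]
    person_scores = detections["scores"][detections["labels"] == 1]

    # Disappearance loss: push every person confidence toward zero so
    # the detector's post-processing drops the box entirely.
    loss = person_scores.sum()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The hard part, and the point of the paper, is making such a pattern survive printing, lighting, camera angles, and a walking, wrinkling shirt rather than just a single still image.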
If interested, see also:
Adversarial Images, Or How To Fool Machine Vision
"Magic AI: These are the Optical Illusions that Trick, Fool, and Flummox Computers"
Fooling The Machine: The Byzantine Science of Deceiving Artificial Intelligence
Another Way To Fool The Facial Recognition Algos
News You Can Use—"Invisible Mask: Practical Attacks on Face Recognition with Infrared"
Disrupting Surveillance Capitalism

And finally, the essential "Machine Learning and the Importance of 'Cat Face'". 

Have I gotten a bit obsessed with beating the machines?
Not if you are serious about being able to walk about without having the voyeurs scoping your every move:
Cargill Invests In Facial Recognition For Cows
Bovine adversarial image teams around the world are working feverishly to beat the machines with ideas ranging from the (udderly) ridiculous:

https://78.media.tumblr.com/a4a468f554d88783e49704847aaa7e6b/tumblr_npkumgzPPk1upbn1no1_1280.jpg

https://i.pinimg.com/736x/6d/2b/16/6d2b16a7ebc25542b387d27ac958b109--funny-cows-mundo-animal.jpg

To the....actually they're all ridiculous.

Hiding from the cameras may be the only foolproof technique:

https://img00.deviantart.net/cdfc/i/2008/247/5/e/cow_hide_by_quanticchaos1000.jpg

We'll leave the ruminators with something to think about, courtesy of their Russian sisters exploring virtual reality:
https://static.independent.co.uk/s3fs-public/thumbnails/image/2019/11/26/14/urqvy4uxj8ayrfx1mx7t3thseavocfgf5q6e0t3ngeg5lmvio4a9wmqdsaxwfmvjvk1fbzvnvcxc6nuexpn3w6ffa9dmztcr.jpg?w968