
A Real-Time Invisibility Cloak That Can Hide You from AI Cameras

Recently, a team of researchers at the University of Maryland, College Park, working with Facebook Artificial Intelligence, developed a real-life "invisibility cloak".

Kapish Khajuria

Recently, a team of researchers at the University of Maryland, College Park, working with Facebook Artificial Intelligence, developed a real-life "invisibility cloak". The cloak is actually a colourful sweater that deletes you right out of a machine's vision.


Although invisibility cloaks have always been a staple of science fiction, it seems you might now be able to experience one in real life.

Notably, the research team used adversarial patterns on the sweater that evade the most common object detectors, making the wearer undetectable, according to a Gagadget report. Simply put, the sweater makes a person 'invisible' to the AI models that detect people.

The developers started out with the goal of testing machine learning systems for vulnerabilities; the result, however, was a print for clothing that AI cameras can't see. A user on Reddit shared a video of the test footage with a caption that reads, "This sweater developed by the University of Maryland utilizes 'adversarial patterns' to become an invisibility cloak against AI."


"This stylish pullover is a great way to stay warm this winter whether in the office or on the go. It features a stay-dry microfleece lining, a modern fit, and an adversarial pattern that evades the most common object detectors. In demonstration, the YOLOv2 detector is evaded using a pattern trained on the COCO dataset with a carefully constructed objective," the team said.

How does the technology behind it work?

Further, they explained that they worked with the COCO dataset, on which the computer vision algorithm YOLOv2 is trained, and identified the patterns the model relies on to recognize a person. They then computed an opposite, adversarial pattern and turned it into an image, a print for a sweater. As a result, the wearer of such a sweater can hide from detection systems.
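To make the idea concrete, here is a minimal, self-contained PyTorch sketch of that kind of optimization loop: a small patch is trained so that a detector's person-confidence scores drop wherever it appears. The toy detector, image sizes, and patch placement below are stand-ins invented for illustration; the actual research targeted YOLOv2 trained on COCO.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in "person detector": maps an image to a confidence score per prior.
detector = nn.Sequential(
    nn.Conv2d(3, 8, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(8, 1, 3, stride=2, padding=1),  # one score per spatial prior
    nn.Flatten(), nn.Sigmoid(),
)

images = torch.rand(4, 3, 64, 64)                  # stand-in photos of people
patch = torch.rand(3, 24, 24, requires_grad=True)  # the "sweater print"
optimizer = torch.optim.Adam([patch], lr=0.05)

for step in range(200):
    patched = images.clone()
    patched[:, :, 20:44, 20:44] = patch  # paste the patch onto each image
    scores = detector(patched)           # person confidence at every prior
    # Suppress the strongest detection anywhere in the image: one surviving
    # high-confidence prior is enough to give the wearer away.
    loss = scores.max(dim=1).values.mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        patch.clamp_(0.0, 1.0)           # keep the patch a printable image

The key design choice is the loss: rather than lowering the average score, it pushes down the strongest detection in each image, because a single confident bounding box is all a surveillance system needs.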


"Most work on real-world adversarial attacks has focused on classifiers, which assign a holistic label to an entire image, rather than detectors which localize objects within an image," the team wrote on the University of Maryland website.

They also explained that detectors work by considering thousands of 'priors' (potential bounding boxes) within the image, with different locations, sizes, and aspect ratios. To fool an object detector, an adversarial example must fool every prior in the image, which is much more difficult than fooling the single output of a classifier.
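To get a rough sense of scale, consider YOLOv2's standard 13x13 output grid with 5 anchor boxes per cell (illustrative numbers for one common input resolution):

# Back-of-the-envelope count of priors for a YOLOv2-style detector.
grid_h, grid_w, anchors_per_cell = 13, 13, 5
num_priors = grid_h * grid_w * anchors_per_cell
print(num_priors)  # 845 candidate boxes, each of which must be fooled

At larger input resolutions, or for detectors with denser anchor sets, the count climbs into the thousands, which is why fooling a detector is so much harder than fooling a classifier's single output.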

Though many on Reddit were fascinated by the "magic sweater", others questioned its effectiveness. One user joked, "So ugly even AI doesn't want to see it." Another commented, "I mean, invisibility seems a bit pushing it. The camera is still recognizing him, just not 100%... Am I wrong in thinking, let's say, if police were using this to find criminals?"

Moreover, the YOLOv2-targeting adversarial sweatshirts hit only around a 50 per cent success rate in the wearable test, according to a Hackster report.
