Chinese Students Invent Invisibility Cloak

China operates a notorious state surveillance apparatus, known for violating its citizens’ privacy and targeting the regime’s political adversaries. Now, Chinese students have invented an invisibility cloak.

To the untrained eye, it looks like any other camouflage-print coat. In fact, it is an invisibility cloak that hides its wearer from artificial-intelligence security cameras.

During the day, the coat’s unique camouflage print, generated by an algorithm, goes undetected by visible-light cameras. At night, when security cameras switch to infrared thermal imaging to identify individuals, thermal devices inside the coat emit varying temperatures, producing an unusual heat pattern that lets the wearer slip under the radar.

The InvisDefense coat, created by a group of four graduate students from China’s Wuhan University, was one of the proposals that won first place in the “Huawei Cup,” an inaugural cybersecurity innovation event sponsored by Chinese technology giant Huawei.

Wei Hui, the computer science graduate student who built the coat’s main algorithm, told VICE World News, “We spent a lot of energy preparing for this, including this product’s design and development.” He claims that the InvisDefense cloak is a “novel” technique to avoid existing security cameras’ AI human identification technology.

When the students tested the coat on campus security cameras, the accuracy of pedestrian detection dropped by 57 percent. One of the most difficult aspects of building the coat, the researchers said, was striking a balance between fooling both the camera and the human eye.

“We had to use an algorithm to design a least conspicuous image that could render camera vision ineffective,” Wei said.
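The approach Wei describes resembles adversarial-patch optimization: search for a pattern that lowers a detector’s confidence while penalizing how conspicuous it looks to people. The toy sketch below illustrates only that general idea; `detector_score` and `conspicuousness` are hypothetical stand-ins, not the InvisDefense algorithm, and the random-search loop is a deliberately simple substitute for whatever optimizer the team actually used.

```python
# Toy sketch of the adversarial-patch trade-off (NOT the InvisDefense
# method): random search over a small greyscale "patch" that minimizes a
# stand-in detector confidence plus a conspicuousness penalty.
import numpy as np

rng = np.random.default_rng(0)

def detector_score(patch):
    # Hypothetical stand-in for an AI detector's confidence that the
    # patch region contains a person.
    return float(np.abs(patch.mean()) + patch.std())

def conspicuousness(patch):
    # Hypothetical stand-in for how odd the pattern looks to a human:
    # penalize pixel values far from mid-grey.
    return float(np.mean((patch - 0.5) ** 2))

def optimize_patch(steps=500, size=(16, 16), weight=0.5):
    """Hill-climb a patch; `weight` trades detector evasion against
    staying inconspicuous. Returns (patch, initial_loss, final_loss)."""
    patch = rng.random(size)
    loss = detector_score(patch) + weight * conspicuousness(patch)
    initial_loss = loss
    for _ in range(steps):
        candidate = np.clip(patch + rng.normal(0.0, 0.05, size), 0.0, 1.0)
        cand_loss = detector_score(candidate) + weight * conspicuousness(candidate)
        if cand_loss < loss:  # keep only strict improvements
            patch, loss = candidate, cand_loss
    return patch, initial_loss, loss
```

Raising `weight` favors a pattern that looks ordinary to humans at the cost of weaker camera evasion, which is the balance the researchers describe.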

In 2019, the nation was home to eight of the world’s top 10 most-surveilled cities. AI identification technologies are now used by both the government and businesses, from flagging “suspicious” Muslims in Xinjiang to deterring youngsters from late-night gaming.

There has been some pushback, though not much. In the nation’s first-ever case contesting the use of facial recognition technology, a law professor in 2020 successfully sued a zoo in Hangzhou for collecting visitors’ facial biometric data without their consent.

The Wuhan University researchers considered similar privacy issues when creating the InvisDefense coat, which will cost roughly 500 yuan ($71).

“Security cameras using AI technology are everywhere. They pervade our lives,” said Wei. “Our privacy is exposed under machine vision.”

“We designed this product to counter malicious detection, to protect people’s privacy and safety in certain circumstances.”

The team’s future research ambitions, Wei said, include rendering inanimate objects and moving cars “invisible” to AI cameras. They are also investigating ways to evade cameras that rely on remote sensing, satellites, or aircraft.

The researchers, who are Chinese citizens, do not seem to be trying to undermine the state’s extensive monitoring apparatus, though. In fact, the team claims that they want to make it stronger.

“The fact that security cameras cannot detect the InvisDefense coat means that they are flawed,” said Wei. “We are also working on this project to stimulate the development of existing machine vision technology, because we’re basically finding loopholes.”
