This Tool Could Protect Your Photos From Facial Recognition

Before and after photographs of, from left, Jessica Simpson, Gwyneth Paltrow and Patrick Dempsey that were cloaked by the Fawkes team. Credit: SAND Lab, University of Chicago

Published Aug 4, 2020

By Kashmir Hill

Researchers at the University of Chicago want you to be able to post selfies without worrying that the next Clearview AI will use them to identify you.

In recent years, companies have been prowling the web for public photos associated with people’s names that they can use to build enormous databases of faces and improve their facial recognition systems, adding to a growing sense that personal privacy is being lost, bit by digital bit.

A start-up called Clearview AI, for example, scraped billions of online photos to build a tool for the police that could lead them from a face to a Facebook account, revealing a person’s identity.

Now researchers are trying to foil those systems. A team of computer engineers at the University of Chicago has developed a tool that disguises photos with pixel-level changes that confuse facial recognition systems.

Named Fawkes in honor of the Guy Fawkes mask favored by protesters worldwide, the software was made available to developers on the researchers’ website last month. After it surfaced on Hacker News, it was downloaded more than 50,000 times. The researchers are working on a free app version for noncoders, which they hope to make available soon.

The software is not intended to be just a one-off tool for privacy-loving individuals. If deployed across millions of images, it would be a broadside against facial recognition systems, poisoning the data sets that those systems gather from the web and degrading their accuracy.

“Our goal is to make Clearview go away,” said Ben Zhao, a professor of computer science at the University of Chicago.

Fawkes converts an image — or “cloaks” it, in the researchers’ parlance — by subtly altering some of the features that facial recognition systems depend on when they construct a person’s face print. In a research paper, reported earlier by OneZero, the team describes “cloaking” photos of the actress Gwyneth Paltrow using the actor Patrick Dempsey’s face, so that a system learning what Ms. Paltrow looks like based on those photos would start associating her with some of the features of Mr. Dempsey’s face.

The changes, usually subtle and not perceptible to the naked eye, would prevent the system from recognizing Ms. Paltrow when presented with a real, uncloaked photo of her. In testing, the researchers were able to fool facial recognition systems from Amazon, Microsoft and the Chinese tech company Megvii.
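In spirit, cloaking resembles a targeted adversarial perturbation in feature space. The sketch below is a rough, hypothetical illustration of that idea, not the Fawkes code: it assumes a generic pretrained face-embedding network (the `embedder` callable is a placeholder) and nudges a photo’s features toward a target identity while capping per-pixel changes so the edit stays subtle.

```python
# Hypothetical sketch of feature-space cloaking; not the Fawkes implementation.
# `embedder` stands in for any pretrained face-embedding network that maps a
# batch of images to feature vectors.
import torch
import torch.nn.functional as F

def cloak(image, target_image, embedder, budget=0.03, steps=200, lr=0.01):
    """Perturb `image` so its embedding drifts toward `target_image`'s,
    keeping each pixel within `budget` of the original (values in [0, 1])."""
    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    with torch.no_grad():
        target_emb = embedder(target_image.unsqueeze(0))
    for _ in range(steps):
        optimizer.zero_grad()
        emb = embedder((image + delta).clamp(0, 1).unsqueeze(0))
        # Pull the cloaked photo's feature vector toward the target identity.
        loss = 1 - F.cosine_similarity(emb, target_emb).mean()
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            delta.clamp_(-budget, budget)  # keep the change hard to see
    return (image + delta).clamp(0, 1).detach()
```

In a scheme like this, the pixel budget is the knob that trades subtlety against strength: a tighter cap keeps the photo looking normal, while a looser one makes the attack more reliable but visible.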

To test the tool, I asked the team to cloak some images of my family and me. I then uploaded the originals and the cloaked images to Facebook to see if they fooled the social network’s facial recognition system. It worked: Facebook tagged me in the original photo but did not recognize me in the cloaked version.

However, the changes to the photos were noticeable to the naked eye. In the altered images, I looked ghoulish, my 3-year-old daughter sprouted what looked like facial hair, and my husband appeared to have a black eye.

The researchers had a few explanations for this. One is that the software is designed to match you with the face template of someone who looks as unlike you as possible, pulling from a database of celebrity faces. That usually ends up being a person of the opposite sex, which leads to obvious problems.
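As a toy illustration of that selection step (the precise Fawkes rule may differ), one could simply pick the candidate whose embedding is least similar to the user’s:

```python
# Toy illustration of dissimilar-target selection; the real Fawkes rule may
# differ. `candidates` stands in for embeddings of a celebrity face database.
import numpy as np

def pick_target(user_emb, candidates):
    """Return the candidate embedding least similar to the user's."""
    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    similarities = [cosine(user_emb, c) for c in candidates]
    return candidates[int(np.argmin(similarities))]
```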

“Women get mustaches, and guys get extra eyelashes or eye shadow,” Mr. Zhao said. He is enthusiastic about what he calls “privacy armor” and previously helped design a bracelet that stops smart speakers from overhearing conversations.

The team says it plans to tweak the software so that the cloak no longer subtly alters the apparent sex of the person in a photo.

The other issue is that my experiment wasn’t what the tool was designed to do. Shawn Shan, a Ph.D. student at the University of Chicago and one of the creators of the Fawkes software, therefore made the changes to my photos as extreme as possible to ensure that they worked. Fawkes isn’t intended to keep a facial recognition system like Facebook’s from recognizing someone in a single photo. It aims to corrupt facial recognition systems more broadly, through an algorithmic attack known as data poisoning.

The researchers said that, ideally, people would start cloaking all the images they uploaded. That would mean a company like Clearview that scrapes those photos wouldn’t be able to create a functioning database, because an unidentified photo of you from the real world wouldn’t match the template of you that Clearview would have built over time from your online photos.
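As a toy illustration of that failure mode (not the researchers’ evaluation), the sketch below treats face embeddings as plain vectors: a template averaged from cloaked photos lands near the decoy identity’s features rather than yours, so a real, uncloaked probe photo no longer matches it.

```python
# Illustrative only: why a template built from cloaked photos fails to match
# a real-world probe. Random vectors stand in for face embeddings.
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
true_identity = rng.normal(size=128)    # your real feature vector
decoy_identity = rng.normal(size=128)   # the "target" the cloak pulls toward

# A scraper builds your template from cloaked photos: noisy copies of the
# decoy's features rather than your own.
cloaked_photos = [decoy_identity + 0.1 * rng.normal(size=128) for _ in range(20)]
template = np.mean(cloaked_photos, axis=0)

# A real, uncloaked probe photo carries your true features.
probe = true_identity + 0.1 * rng.normal(size=128)

print(f"probe vs. poisoned template: {cosine(probe, template):+.2f}")      # near 0
print(f"probe vs. true identity:     {cosine(probe, true_identity):+.2f}") # near 1
```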

But Clearview’s chief executive, Hoan Ton-That, ran a version of my Facebook experiment on the Clearview app and said the technology did not interfere with his system. In fact, he said, his company could use images cloaked by Fawkes to improve its ability to make sense of altered images.

“There are billions of unmodified photos on the internet, all on different domain names,” Mr. Ton-That said. “In practice, it’s almost certainly too late to perfect a technology like Fawkes and deploy it at scale.”

Other experts were also skeptical that Fawkes would work. Joseph Atick, a facial recognition pioneer who has come to regret the surveillance society he helped to create, said the volume of images of ourselves that we had already made available would be too hard to overcome.

“The cat is out of the bag. We’re out there,” Dr. Atick said. “While I encourage this type of research, I’m highly skeptical this is a solution to solve the problem that we’re faced with.”

Dr. Atick thinks that only lawmakers can ensure that people have a right to facial anonymity. No such federal law is on the horizon, though Democratic senators did recently propose a ban on government use of facial recognition.

“I personally think that no matter which approach you use, you lose,” said Emily Wenger, a Ph.D. student who helped create Fawkes. “You can have these technological solutions, but it’s a cat-and-mouse game. And you can have a law, but there will always be illegal actors.”

Ms. Wenger thinks “a two-prong approach” is needed, in which individuals have both technological tools and a privacy law to protect themselves.

Elizabeth Joh, a law professor at the University of California, Davis, has written about tools like Fawkes as “privacy protests,” where individuals want to thwart surveillance but not for criminal reasons. She has repeatedly seen what she called a “tired rubric” of surveillance, then countersurveillance and then anti-countersurveillance, as new monitoring technologies are introduced.

“People are feeling a sense of privacy exhaustion,” Ms. Joh said. “There are too many ways that our conventional sense of privacy is being exploited in real life and online.”

For Fawkes to have an immediate effect, we would need all the photos of ourselves that we had already posted to be cloaked overnight. That could happen if a huge platform that maintains an enormous number of online images decided to roll out Fawkes systemwide.

If a platform like Facebook adopted Fawkes, it could prevent a future Clearview from scraping its users’ images to identify them. “They could say, ‘Give us your real photos, we’ll cloak them, and then we’ll share them with the world so you’ll be protected,’” Mr. Zhao said.

Jay Nancarrow, a Facebook spokesman, did not rule out that possibility when asked for comment. “As part of our efforts to protect people’s privacy, we have a dedicated team exploring this type of technology and other methods of preventing photo misuse,” Mr. Nancarrow said.

“I’m actually interning on that exact team at Facebook right now,” said the Fawkes co-creator Mr. Shan.

THE NEW YORK TIMES