Amid increased attempts to outsmart facial recognition technology, which has been used by law enforcement to monitor protests and identify criminal suspects, people have sought ways to guard their anonymity. A new study out of Ben-Gurion University in Israel outlines how “adversarial makeup” can fool some facial recognition systems, preventing them from identifying individuals already enrolled in a system.
In the study, researchers sought a way to evade facial recognition that doesn’t arouse suspicion among human observers, like an airport security officer who will ask you to take down the face mask you’re probably wearing in these times. What they found is that makeup, applied in a certain way, can make someone’s face more difficult to identify.
Unique biometrics — The basic idea is that facial recognition works by identifying the subtle points that distinguish one face from another. Your eyebrows are a certain distance apart, and your lips might have a particular curve — these attributes make up your “facial fingerprint.” By mapping these points, visualized as white dots, the researchers were able to apply just enough makeup to make a difference — so that your nose appears narrower, for instance.
A video shows the technique at work in a hallway, where a woman walks through with and without the makeup applied. When she walks through with a bare face, the ArcFace facial recognition model immediately identifies her as someone who’s been blacklisted. But when she walks through again, with adversarial makeup applied, it does not.
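The hallway demo boils down to a thresholded similarity test: systems like ArcFace reduce each face to a numeric embedding vector and declare a match when a probe face is close enough to an enrolled one. The sketch below illustrates that decision with toy 4-dimensional vectors and an illustrative 0.5 cutoff — real ArcFace embeddings are 512-dimensional and the threshold is tuned per deployment, so treat the numbers as assumptions.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (lists of floats)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def is_match(probe, enrolled, threshold=0.5):
    """Flag a blacklist hit when the probe face is close enough to an enrolled one."""
    return cosine_similarity(probe, enrolled) >= threshold

# Toy embeddings standing in for real, much larger face vectors.
enrolled = [0.9, 0.1, 0.3, 0.2]
bare_face = [0.88, 0.12, 0.31, 0.19]   # nearly identical embedding -> match
with_makeup = [0.2, 0.9, -0.4, 0.1]    # embedding pushed away -> no match

print(is_match(bare_face, enrolled))    # True
print(is_match(with_makeup, enrolled))  # False
```

Adversarial makeup works by nudging the probe embedding far enough from the enrolled one that the similarity falls below the system’s threshold, while the face still looks ordinary to a human.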
Granted, this method takes a lot of work, and facial recognition technology is constantly getting better. Masks are still the best option, as more sophisticated facial recognition systems have shown themselves capable of seeing through trickery. And programs like Clearview AI work by comparing a photo against ones culled from social media, which may still return you as a match.
The state of surveillance — Facial recognition has been the subject of intense scrutiny because it’s been shown to misidentify Black people disproportionately to other demographics, which has directly led to false arrests. And its use at protest demonstrations is viewed as an intimidation tactic, as people might worry their every move is being monitored.
Companies including Amazon and convenience store chain Rite Aid have responded by walking back some of their facial recognition programs, but Immigration and Customs Enforcement just announced a fresh $3.9 million deal to buy the technology and use it for “rapid alternatives to detention enrollments.” The contract states it will be used at immigration detention centers, though specifically how remains unclear.
Facial recognition has been lauded as an efficient way to catch criminals, but its unregulated use is concerning. Perhaps it’s time to invest in some makeup.