Facial recognition AI probably used your selfies for training

If you have a Flickr account, your selfies might have helped enable our surveillance state.

By now, we all have more than enough reasons to never post a selfie again. Aside from abetting a culture of digital narcissism and fueling crippling, social media-derived dopamine addictions, posting them just isn't great for your online privacy. Recently, a whole other dimension of the problem has come into focus: most of the sites receiving the bulk of our photo uploads have really (and we mean really) lax content licenses.

Because of that, it's more than likely that at least some of the photos and selfies you've uploaded over the years have been harvested by developers to improve AI facial recognition systems. If the knowledge that your mug probably helped further our sketchy surveillance-industrial complex troubles you, well, at least you're not alone. Enter: Exposed.ai.

The odds aren't in your favor — Developed by Adam Harvey and Jules LaPlace and based on the pair's previous project, Megapixels, Exposed.ai scours more than 3.5 million pictures across six massive image datasets used to train facial recognition systems, checking whether any of your past photos have been used in AI machine learning. For now, Exposed.ai is limited to scanning Flickr photos to see if they appear in "biometric image training or testing datasets," though the site says it hopes to add more sources in the future. Keep in mind that's 3.5 million photos, not individual faces, and a single photo can contain several... so yeah. You've been warned.
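For the curious, the lookup itself is conceptually simple: extract an identifier from a Flickr photo URL and check it against a dataset's published index of harvested images. Here's a minimal Python sketch of that idea. The file name, URL, and function names are hypothetical illustrations, not Exposed.ai's actual code or data format.

```python
# Hypothetical sketch: check whether a Flickr photo ID appears in a
# facial recognition dataset's published index. The index file name and
# its one-ID-per-line format are assumptions for illustration.
import re


def flickr_photo_id(url: str) -> str | None:
    """Extract the numeric photo ID from a Flickr photo page URL."""
    # Flickr photo pages look like flickr.com/photos/<user>/<photo_id>/
    match = re.search(r"flickr\.com/photos/[^/]+/(\d+)", url)
    return match.group(1) if match else None


def load_dataset_index(path: str) -> set[str]:
    """Load a set of photo IDs from a dataset metadata dump."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}


if __name__ == "__main__":
    # "megaface_photo_ids.txt" is a stand-in for a real dataset index.
    index = load_dataset_index("megaface_photo_ids.txt")
    url = "https://www.flickr.com/photos/example_user/1234567890/"
    photo_id = flickr_photo_id(url)
    if photo_id and photo_id in index:
        print(f"Photo {photo_id} appears in the dataset.")
    else:
        print("No match found in this dataset's index.")
```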

Not much to do about past photos — Alas, because Exposed.ai deals with pre-existing datasets, nothing can change the fact that the selfie of you wearing your Taking Back Sunday t-shirt circa 2006 has already helped train AI systems to better identify private citizens. The makers of Exposed.ai explain in their FAQ that some dataset creators do provide a way to request that your images be removed from future AI testing, and the site is working to compile the relevant information and steps soon. As of right now, though, you're kind of on your own.

And if egregious abuse of online privacy loopholes weren't enough to enrage you, it doesn't seem like those millions of faces have done much to change the fact that many facial recognition AI programs remain racist as hell.