Loophole

Facebook is looking for ways to pull targetable data from your encrypted texts

Not just them, either. Amazon, Google, and Microsoft are also named in a recent report.

Padlock and chains on a laptop. Shutterstock

Facebook confirmed to The Information earlier this morning that it is building a team, including former Microsoft AI researchers and cryptographers, to explore ways the company could pull usable data from encrypted services like WhatsApp without technically decrypting the information.

According to experts, the new project could allow Mark Zuckerberg and team to develop targeted ads derived from WhatsApp’s encrypted messages, “or to encrypt the data it collects on billions of users without hurting its ad-targeting capabilities.” Facebook, along with Big Tech giants like Amazon, Microsoft, and Google, is looking into something called homomorphic encryption, which hypothetically would let companies analyze information like “financial records and medical data” while technically leaving those private files encrypted in cloud storage. Or something. It sounds pretty fucked up. Not to mention ironic.
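To make that less hand-wavy: below is a minimal toy sketch of one classic partially homomorphic scheme (Paillier), with tiny hard-coded primes purely for illustration. It is not whatever Facebook’s team is actually building; it just shows the core trick, which is that you can combine two ciphertexts and the hidden numbers add up, without anyone decrypting anything along the way.

```python
# Toy Paillier cryptosystem -- "compute on data you never decrypt."
# Tiny hard-coded primes and zero hardening, purely for illustration. Python 3.9+.
import math
import secrets

p, q = 293, 433                       # real deployments use ~1024-bit primes
n = p * q                             # public modulus
n_sq = n * n
lam = math.lcm(p - 1, q - 1)          # Carmichael's lambda(n), kept private
g = n + 1                             # standard simple choice of generator
mu = pow(lam, -1, n)                  # private decryption helper

def encrypt(m: int) -> int:
    """Encrypt an integer 0 <= m < n using only the public values (n, g)."""
    while True:
        r = secrets.randbelow(n - 1) + 1    # random blinding factor
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    """Recover the plaintext using the private values (lam, mu)."""
    ell = (pow(c, lam, n_sq) - 1) // n
    return (ell * mu) % n

# The homomorphic part: multiplying ciphertexts adds the hidden plaintexts.
a, b = 17, 25
c_sum = (encrypt(a) * encrypt(b)) % n_sq   # done entirely on encrypted values
assert decrypt(c_sum) == a + b             # 42, visible only to the key holder
```

Fully homomorphic schemes extend this to arbitrary computations, not just addition, which is what makes the Big Tech use cases possible and also what makes them so expensive to run today.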

Although Facebook is touting homomorphic encryption as a way to keep bad actors out and minimize cybersecurity breaches, it’s still super creepy to have multiple major companies hunting for legal and ethical loopholes that would make their already invasive targeted advertising ecosystems even more effective. There’s really no telling where this road leads, although experts note it could be years before we actually see the technology deployed at any wide scale. A small comfort, if nothing else.

This has been in the works for a while — Although Facebook is relatively new to the research, companies like Microsoft, Google, and IBM have been looking into homomorphic encryption’s uses for over a decade, ever since IBM researcher Craig Gentry published a 2009 paper showing how such a feat could actually be achieved. As of right now, the tech is still incredibly expensive and energy-hungry, and is really only used in niche medical settings, such as IBM’s work analyzing X-ray images to determine whether hospital patients have COVID-19. Researchers have also described combining homomorphic encryption with machine learning to identify rare diseases from patient records and locations, all without exposing the underlying data. Can we just restrict it to those areas, maybe?
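For the machine-learning angle, the flow researchers describe looks roughly like the hedged sketch below, assuming the open-source python-paillier (phe) package: a hospital encrypts patient features, an outside service applies its plaintext model weights to the ciphertexts, and only the hospital’s key can read the resulting risk score. The feature names, weights, and numbers are invented for illustration, not taken from IBM’s or anyone else’s actual system.

```python
# Sketch: scoring encrypted patient data with a plaintext linear model.
# Requires the open-source python-paillier package: pip install phe
from functools import reduce
from operator import add

from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

# Hypothetical patient features (age, lab value, prior visits) -- made up.
features = [62, 4.8, 3]
encrypted_features = [public_key.encrypt(x) for x in features]

# The scoring service only ever sees ciphertexts and the public key;
# its (made-up) model weights stay in plaintext on its own side.
weights = [0.04, 0.9, 0.25]
terms = [ex * w for ex, w in zip(encrypted_features, weights)]
encrypted_score = reduce(add, terms)          # still encrypted end to end

# Only the hospital, holding the private key, can read the result.
print(private_key.decrypt(encrypted_score))   # 0.04*62 + 0.9*4.8 + 0.25*3 = 7.55
```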

Legally nebulous — Of course, this raises the obvious questions: Is what Facebook wants to do even legal? Can you call something truly encrypted if targetable data can still be gleaned from it? And, last but not least, what the absolute hell? Unfortunately, our (admittedly amateur) initial read is that this kind of thing probably falls within Facebook’s rights. The company technically owns the data, and as long as it technically isn’t looking at our personal info itself, that’s probably enough to be legally permissible. Which most definitely sucks.