As far as the Apple website is concerned, plans to add automatic scanning for child sexual abuse material (CSAM) never existed at all. Luckily we have the Internet Archive to remind us of what once was.
Apple’s dedicated Child Safety page, as it stands now, spends most of its real estate on updates rolled out this week with iOS 15.2. Those features are tame by comparison: for the most part, a parental controls feature that auto-blurs nude images sent to children and extra guidance on reporting instances of online sexual abuse.
The Internet Archive shows the page used to tell a very different story. As recently as December 10, the page included two sections dedicated to the CSAM software. Recent versions of the page left the exact timing of the update very much up in the air; now it appears Apple could be mulling over the idea of scrapping it altogether.
NeuralHash no more? — It’s very unclear whether Apple has totally killed off the photo-hashing idea. All we really know is that the website no longer mentions it in any way, shape, or form.
After the initial wave of criticism rolled in, Apple relented a bit on NeuralHash’s rollout. On September 3, an update was added to the top of the Child Safety page:
Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.
In the intervening months, Apple has been tight-lipped about when we might see that feature made widely available. Dead silent, in fact. As if perhaps Apple decided the best course of action would be to pretend NeuralHash never even existed.
Doomed from the start — Internet privacy advocates took issue with NeuralHash immediately after its announcement. We’re not talking about small-time operations here, either — even Edward Snowden got involved with the conversation.
The main criticism of NeuralHash is that it’s an invasion of privacy. Set against Apple’s broader corporate messaging, which routinely touts the company’s cutting-edge privacy features, such tech borders on hypocritical.
Activists have also worried about the power NeuralHash affords law enforcement: it’s designed to flag specific content and automatically send reports to the cops. It doesn’t take much imagination to see the more dystopian extremes of such software.
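To make the concern concrete, here’s a rough sketch of how hash-based flagging works in general. This is not Apple’s NeuralHash, which is a neural-network-based perceptual hash with cryptographic safeguards; it’s a toy average-hash matcher, and every name in it is hypothetical. The key property it illustrates is that near-duplicate images still match a database of known hashes, which is exactly what makes the "just add more hashes to the list" scenario worrying to activists.

```python
# Illustrative sketch only. NeuralHash is a neural-network perceptual hash;
# this toy uses a simple average hash over an 8x8 grayscale grid to show
# the general matching flow. All names here are hypothetical.

def average_hash(pixels):
    """Hash 64 grayscale values (0-255) into a 64-bit fingerprint:
    each bit is 1 if that pixel is at or above the image's average."""
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming_distance(a, b):
    """Count the bits that differ between two fingerprints."""
    return bin(a ^ b).count("1")

def flag_if_known(pixels, known_hashes, threshold=4):
    """Return True if the image's hash is close to any hash in the
    database -- the step that would trigger an automatic report."""
    h = average_hash(pixels)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)

# A slightly altered copy of a known image still matches:
original = [10] * 32 + [200] * 32
tweaked  = [12] * 32 + [198] * 32
known_db = {average_hash(original)}
print(flag_if_known(tweaked, known_db))  # True
```

The point of the perceptual (rather than cryptographic) hash is that small edits don’t change the fingerprint much, so matching survives resizing and recompression. The flip side, as critics note, is that whoever controls the hash database controls what gets flagged.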
So either Apple finally decided NeuralHash wasn’t worth the backlash, or it’s just been put on the long-term back burner. We’ll try our best to be optimistic.