When even Facebook is concerned...

…you might be doing something against your users’ interests. Here’s Will Cathcart, Facebook’s head of WhatsApp:

“This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable.

Can this scanning software running on your phone be error proof? Researchers have not been allowed to find out. Why not? How will we know how often mistakes are violating people’s privacy?”

Remember the ad campaign “What happens on your iPhone, stays on your iPhone”? That lasted a grand total of two years. And yes, Facebook has a vested interest in slagging Apple. But it’s not alone — the condemnation of Apple has been universal in the privacy space:

“We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.”

Here’s a good comment on Twitter:

“This isn’t really a ‘slippery slope’ — it’s a single heavily greased step. You need one order with a gag attached saying ‘you’re required to add this list of hashes’ & your carefully crafted child protection system becomes an all-purpose population-scale search tool.”
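
That point is worth making concrete. Below is a minimal, hypothetical sketch of the matching step at the heart of any client-side scanning design — not Apple’s actual implementation, which uses a perceptual hash called NeuralHash and a private set intersection protocol rather than a plain SHA-256 lookup. The names (`ScanList`, `scan`) are illustrative. What it shows is the structural problem: the device only checks content against an opaque list of hashes, and nothing in the matching code knows or cares what those hashes represent.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of client-side hash matching. Apple's real system
// uses NeuralHash (a perceptual image hash) plus private set intersection;
// SHA-256 is used here purely to keep the example self-contained.
struct ScanList {
    // Opaque hashes supplied by whoever controls the update channel.
    let flaggedHashes: Set<Data>
}

func scan(photo: Data, against list: ScanList) -> Bool {
    // Hash the content and check membership. The logic is identical
    // whether a hash identifies CSAM, a protest flyer, or a leaked
    // document -- the list alone decides what gets flagged.
    let digest = Data(SHA256.hash(data: photo))
    return list.flaggedHashes.contains(digest)
}
```

Swap in a different set of hashes and the same code, unchanged, searches every enrolled phone for whatever the list’s owner wants found. That is the “single heavily greased step.”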

Hopefully Apple reverses its plan. If not, a quarter of the world’s phones are about to get a lot less secure.