I’ve been using WhatsApp since before it was a Meta property, and it’s probably the only current Meta app I’d keep. It’s a great instant messaging app that works on iPhone and Android and provides the same end-to-end encryption as competing chat apps like iMessage and Signal.
That encryption predates Facebook’s purchase of WhatsApp, and it’s a feature the company had to ensure it kept unaltered over the years, even if it meant Facebook (and now Meta) could not monetize WhatsApp’s massive user base.
Now that Meta has developed Meta AI, its answer to ChatGPT, the company has forced the assistant on all its users. All Meta apps, whether Facebook, Messenger, Instagram, or WhatsApp, now feature that multi-colored circle that says AI is in the room.
I’ve said more than once that I do not want that sort of invasive AI in WhatsApp. It’s one thing to choose to text ChatGPT, Perplexity, or even Meta AI from WhatsApp of your own accord, and quite another to have the Meta AI menu invade your chats.
Meta added Meta AI to WhatsApp without breaking the app’s strong encryption. It’s also worth noting that Meta gave WhatsApp a new privacy feature that ensures Meta AI can’t be brought into a conversation if one of the parties doesn’t want it.
But now Meta wants to offer WhatsApp users even more advanced AI features that require data to be processed outside of WhatsApp. That is, Meta wants to extract data from chats for things like summarization and suggestions and process it in the cloud.
The company developed an encrypted way to do it so end-to-end encryption would not be broken. However, this paints a massive target on Meta’s back. There’s no question that hackers, including nation-states with massive resources, would want to try to break the new privacy features Meta developed for WhatsApp chats.
If you can’t tell, I’m already dying to turn this feature off as soon as it becomes available in WhatsApp.
The new security feature for processing AI requests based on encrypted WhatsApp chats is called Private Processing, per Wired. It’s similar to Apple’s Private Cloud Compute, which is meant to keep Apple Intelligence cloud-based features protected with the same encryption as on-device iPhone security.
There are a few big caveats here. Apple Intelligence (what’s available right now) is designed to work mostly on the iPhone, iPad, and Mac before Private Cloud Compute processing is needed. Also, one trusts Apple’s privacy claims more than Meta’s, though independent security researchers will surely inspect Private Processing in the future and test whether Meta’s security claims hold up. Meta can’t afford to mess this up.
According to the report, using Private Processing in WhatsApp will be an opt-in feature, meaning it won’t be the default behavior for Meta AI. It’s unclear what that opt-in process will look like, but Private Processing should launch in the coming weeks. Meta should explain everything in detail by then. Worst case, you can enable “Advanced Chat Privacy” for each WhatsApp chat and ensure Meta AI can’t be used in it.
Wired notes that Private Processing is supposed to be secure by design, with audits to follow:
Private Processing is built with special hardware that isolates sensitive data in a “Trusted Execution Environment,” a siloed, locked-down region of a processor. The system is built to process and retain data for the minimum amount of time possible and is designed to grind to a halt and send alerts if it detects any tampering or adjustments.
Still, extracting data from encrypted chats is the kind of feature that will invite attacks. That’s not to say that hackers won’t try to attack Apple’s own Private Cloud Compute infrastructure for the same purposes, because they will.
From a user perspective, I don’t want a third-party AI like Meta AI on my devices to access messages for summarization and composition purposes inside a chat app.
I do want an operating-system-level AI like Siri on the iPhone to have the ability to summarize encrypted chats and offer suggestions, because that’s a personal assistant that should get access to lots of user data to provide helpful assistance. The key difference here is that Siri would have to process that data on the device rather than in Apple’s cloud. Even here, where there will be incentives to use an OS-level AI, I’d want chat data not to be exported to servers in any fashion.
Meta AI doesn’t have that luxury. Meta will never get the same device access on iPhone as Siri, so it will have to offer AI features inside its apps. Apparently, some people want to use features like summarization and AI text generation inside WhatsApp, and Meta thinks it should create tools like Private Processing to offer them.