Apple Intelligence promises better AI privacy. Here’s how it actually works

Apple is making all production PCC server builds publicly available for inspection, so that people unaffiliated with Apple can verify that PCC is (and isn’t) doing what the company claims, and that everything is deployed correctly. All PCC server images are recorded in a cryptographic attestation log—essentially an indelible record of signed assertions—and each entry includes a URL to download that individual build. PCC is designed so that Apple can’t put a server into production without registering it. And in addition to offering transparency, the system functions as a crucial enforcement mechanism to prevent malicious actors from setting up unauthorized PCC nodes and diverting traffic. If a server build hasn’t been registered, iPhones won’t send it any Apple Intelligence queries or data.
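As a rough illustration of that gating logic, here is a minimal Swift sketch of a device-side check against a transparency log. All of the names and types below are hypothetical stand-ins rather than Apple’s actual PCC interfaces, and the log is simplified to an in-memory list; the point is only that a request goes out solely when the server’s attested software build matches an entry that has been publicly recorded.

```swift
import Foundation
import CryptoKit

// Hypothetical transparency-log record: one entry per released PCC server
// build, keyed by the SHA-256 digest of its software image, plus the URL
// where that individual build can be downloaded for inspection.
struct LogEntry {
    let releaseDigest: SHA256.Digest
    let downloadURL: URL
}

// Hypothetical attestation a server presents to the client: the digest of
// the software it claims to be running.
struct ServerAttestation {
    let softwareDigest: SHA256.Digest
}

// Device-side gate: only send the request if the attested build appears in
// the transparency log the device trusts.
func shouldSendRequest(to attestation: ServerAttestation,
                       trustedLog: [LogEntry]) -> Bool {
    trustedLog.contains { $0.releaseDigest == attestation.softwareDigest }
}

// Example: a registered build is accepted, an unregistered one is refused.
let registeredImage = Data("pcc-build-2024.10".utf8)
let registeredDigest = SHA256.hash(data: registeredImage)
let log = [LogEntry(releaseDigest: registeredDigest,
                    downloadURL: URL(string: "https://example.com/pcc-build-2024.10")!)]

let knownServer = ServerAttestation(softwareDigest: registeredDigest)
let unknownServer = ServerAttestation(
    softwareDigest: SHA256.hash(data: Data("unregistered-build".utf8)))

print(shouldSendRequest(to: knownServer, trustedLog: log))   // true
print(shouldSendRequest(to: unknownServer, trustedLog: log)) // false
```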

PCC is part of Apple’s bug bounty program, and vulnerabilities or misconfigurations found by researchers may be eligible for cash rewards. However, Apple says that no one has found any flaws in PCC since the iOS 18.1 beta became available in late July. The company acknowledges that, so far, it has only made the tools for assessing PCC available to a select group of researchers.

Several security researchers and cryptographers told WIRED that Private Cloud Compute looks promising, but they haven’t spent much time researching it yet.

“Building Apple Silicon servers in the data center when we didn’t have any before, building a custom operating system to run in the data center was a massive undertaking,” Federighi says. He adds that “building the trust model where your device will refuse to send a request to a server unless the signature of all the software running the server has been published in a transparency ledger was certainly one of the most unique elements of the solution, and absolutely fundamental to the trust model.”

In response to questions about Apple’s partnership with OpenAI and the ChatGPT integration, the company emphasizes that the partnerships are not covered by PCC and operate separately. ChatGPT and other integrations are disabled by default and must be manually enabled by users. Then, if Apple Intelligence determines that a request would be better fulfilled by ChatGPT or another partner platform, it notifies the user each time and asks if they want to proceed. Additionally, people can use these integrations while logged into their account for a partner service like ChatGPT, or they can use them through Apple without signing in separately. Apple said in June that it is also working on another integration with Google’s Gemini.
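To make that opt-in flow easier to follow, here is a small Swift sketch of the decision logic as described above: integrations are off by default, and even when enabled, the user is asked before each individual request is handed to a partner. The types and the askUserEachTime callback are illustrative assumptions, not Apple’s actual API.

```swift
// A minimal sketch of the consent flow described above; all names are hypothetical.
enum Destination {
    case onDeviceOrPCC          // handled by Apple Intelligence itself
    case partner(name: String)  // e.g. ChatGPT, only with explicit consent
}

struct PartnerIntegration {
    let name: String
    var enabledByUser: Bool = false   // integrations are disabled by default
}

func route(request: String,
           suggestedPartner: PartnerIntegration?,
           askUserEachTime: (String) -> Bool) -> Destination {
    // Only consider a partner if the user has switched the integration on.
    guard let partner = suggestedPartner, partner.enabledByUser else {
        return .onDeviceOrPCC
    }
    // Even then, confirm with the user on every single request.
    return askUserEachTime(partner.name) ? .partner(name: partner.name)
                                         : .onDeviceOrPCC
}

// Example: integration enabled, and the user approves this particular request.
let chatGPT = PartnerIntegration(name: "ChatGPT", enabledByUser: true)
let destination = route(request: "Plan a five-course menu",
                        suggestedPartner: chatGPT,
                        askUserEachTime: { _ in true })
print(destination)  // partner(name: "ChatGPT")
```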

Apple said this week that in addition to launching in U.S. English, Apple Intelligence will come to Australia, Canada, New Zealand, South Africa, and the U.K. in December. The company also said that support for other languages, including Chinese, French, Japanese, and Spanish, will arrive next year. Whether Apple Intelligence will be permitted under the European Union’s AI Act, and whether Apple will be able to offer PCC in its current form in China, are open questions.

“Our goal is to do our best to deliver the best capabilities to our customers wherever we can,” Federighi said. “But we have to comply with regulations and there are uncertainties in certain environments that we are trying to resolve so that we can deliver these capabilities to our customers as quickly as possible. So we are trying.”

He adds that as Apple expands how much Apple Intelligence processing it can run on-device, that could serve as a workaround in some markets.

Those with access to Apple Intelligence will be able to do much more than they could on previous versions of iOS, from writing tools to photo analysis. Federighi says his family celebrated their dog’s recent birthday with an Apple Intelligence-generated Genmoji (which WIRED viewed and confirmed to be very cute). But while Apple’s AI is meant to be as useful and invisible as possible, there’s a lot at stake for the security of the infrastructure underpinning it. So how are things going so far? Federighi sums it up without hesitation: “The launch of Private Cloud Compute has been delightfully smooth.”
