AI comes to your Apple devices. Will it be safe?

At its annual developer conference on Monday, Apple announced its long-awaited artificial intelligence system, Apple Intelligence, which will personalize user experiences, automate tasks and, as CEO Tim Cook promised, usher in a “new standard of privacy in AI.”

While Apple maintains that its internal AI is designed with security in mind, its partnership with OpenAI has drawn criticism. OpenAI's ChatGPT tool has long been the subject of privacy concerns: launched in November 2022, it collected user data without explicit consent to train its models, and only began allowing users to opt out of such data collection in April 2023.


Apple says the ChatGPT integration will only be used with explicit consent, for isolated tasks such as composing emails and other writing tools. But security professionals will be watching to see how this and other concerns develop.

“Apple is saying a lot of the right things,” said Cliff Steinhauer, director of information security and engagement at the National Cybersecurity Alliance. “But it remains to be seen how it is implemented.”

A latecomer to the generative AI race, Apple has lagged behind peers like Google, Microsoft and Amazon, whose shares have risen on investor confidence in their AI efforts. Apple, meanwhile, has so far refrained from integrating generative AI into its flagship consumer products.

The company wants to make it seem like the wait was intentional, as a means to “apply this technology responsibly,” Cook said at Monday’s event. While other companies launched products quickly, Apple has spent the last few years building the majority of Apple Intelligence offerings with its own technology and proprietary foundational models, ensuring that as little user data as possible leaves the Apple ecosystem.

Artificial intelligence, which relies on collecting large amounts of data to train large language models, poses a unique challenge to Apple’s long-standing approach to privacy. Vocal critics like Elon Musk have argued that maintaining user privacy while integrating AI is impossible. Musk even said he would ban his employees from using Apple devices for work once the announced updates are released. But some experts disagree.

“With this announcement, Apple is paving the way for companies to balance data privacy and innovation,” said Gal Ringel, co-founder and CEO of the data privacy software firm Mine. “The positive reception to this news, unlike other recent AI product launches, shows that prioritizing privacy is certainly a worthwhile strategy in today’s world.”


Many recent AI releases have ranged from dysfunctional and silly to downright dangerous, recalling the classic Silicon Valley ethos of “move fast and break things”. Apple appears to be taking a different approach, Steinhauer said.

“If you think about the concerns we’ve had about AI up to this point, it’s that platforms often release products and then fix problems as they arise,” he said. “Apple is proactively addressing common concerns people have. It is the difference between security by design and security after the fact, which will always be imperfect.”

At the heart of Apple’s AI privacy assurances is its new Private Cloud Compute technology. Apple aims to do most of the processing required to run Apple Intelligence features on the device itself. But for features that require more processing power than the device can handle, the company will offload the work to the cloud while “protecting user data,” Apple executives said Monday.


To achieve this, Apple will send only the data necessary to fulfill each request, apply additional security measures to data at every endpoint, and will not store data indefinitely. Apple will also make all tools and software related to the private cloud available for third-party verification, executives said.

Private Cloud Compute is “a notable leap in AI privacy and security,” said Krishna Vishnubhotla, vice president of product strategy at the mobile security platform Zimperium, adding that the independent verification component especially stands out.

“In addition to building user trust, these innovations promote higher security standards for mobile devices and applications,” he said.
