Sunday, November 17, 2024

Apple’s leap into generative AI makes for a privacy balancing act


“Private Cloud Compute allows Apple Intelligence to process complex user requests with groundbreaking privacy,” said Apple software engineering boss Craig Federighi at the event.

“We’ve extended iPhone’s industry-leading security to the cloud, with what we believe is the most advanced security architecture ever deployed for cloud AI at scale.”

I think it’s fair to say that sending any personal data from your device over the internet to a remote server is inherently less secure than having the data never leave the device. So we can’t simply take Apple’s word that its solution is more secure than others (after all, other web giants like Google and Amazon spend billions on secure cloud storage too), and Federighi’s implication that Private Cloud Compute is as secure as local iPhone data storage deserves scrutiny.

That said, Apple’s setup has a number of points in its favour that separate it from other companies involved in cloud AI processing.

For starters, Apple designs and controls both the device the data originates from and the server itself. For example, when an iPhone 15 needs to contact a secure server to crunch a particular AI request, both systems run on Apple silicon with features including Secure Enclave (which protects encryption keys so data can’t be intercepted) and Secure Boot (which makes sure a verified operating system is running). The devices will also be able to verify each other as secure before any data is exchanged.
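In concrete terms, that “verify each other before data is exchanged” step resembles a remote-attestation handshake: the phone challenges the server, the server proves it is running known-good software, and only then does any personal data leave the device. Below is a minimal, purely illustrative Python sketch of that idea; the build identifiers, allow-list and hashing scheme are assumptions for illustration, not Apple’s actual protocol, which relies on hardware keys and signed measurements.

```python
import hashlib
import secrets

# Hypothetical allow-list of software "measurements" the client trusts.
# In a real system these would be signed, published server builds.
KNOWN_GOOD_MEASUREMENTS = {
    hashlib.sha256(b"pcc-server-build-1.0").hexdigest(),
}

def server_attestation(build_id: bytes, nonce: bytes) -> dict:
    """Server reports a hash of its software, bound to the client's nonce."""
    measurement = hashlib.sha256(build_id).hexdigest()
    proof = hashlib.sha256(measurement.encode() + nonce).hexdigest()
    return {"measurement": measurement, "proof": proof}

def client_verifies(attestation: dict, nonce: bytes) -> bool:
    """Client releases data only if the measurement is on the allow-list
    and the proof is bound to this session's nonce (blocking replays)."""
    if attestation["measurement"] not in KNOWN_GOOD_MEASUREMENTS:
        return False
    expected = hashlib.sha256(
        attestation["measurement"].encode() + nonce
    ).hexdigest()
    return attestation["proof"] == expected

nonce = secrets.token_bytes(16)          # fresh challenge per session
att = server_attestation(b"pcc-server-build-1.0", nonce)
print(client_verifies(att, nonce))       # True: safe to send the request
bad = server_attestation(b"tampered-build", nonce)
print(client_verifies(bad, nonce))       # False: request is refused
```

The point of the design is that trust is decided per session, from a fresh challenge, rather than assumed from a one-time login.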

But even if data is difficult to intercept, what about malicious actors hacking in and stealing stored data? No server is truly safe from that. But as Apple tells it, there won’t really be any data of value to steal. The user’s device sends only the data strictly needed to answer the request, that data is accessible only to the specific server handling it, and it’s never stored, so it disappears once the task is complete.

Apple devices will ask before sharing personal data with OpenAI.

Apple has also said independent experts are welcome to examine the code it uses to run its secure servers, to verify that they do what Apple claims. You could view the Private Cloud Compute system as Apple moving the goalposts from “we won’t take your data from your device” to “we will, but we won’t look at it”. But some of the AI tasks in question just aren’t possible on a phone, so if you’re going to send your data to someone, it might as well be the company promising not to even peek at it.

Where the most questions arise in terms of Apple Intelligence and privacy is with regard to third-party integrations. Apple devices will be able to decide whether a different AI service would be best to accomplish a request, and at WWDC the first of these services was revealed to be OpenAI’s ChatGPT. This means that iPhone, iPad and Mac users will get free access to some of the most powerful large language models around to help them rewrite text or summon custom images, but it also means sending personal data outside the Apple ecosystem.

To be clear, this is no more a privacy risk than downloading the ChatGPT app from the App Store and using it. But since the technology is going to be a core and integrated part of the operating system on Apple products, it’s worth noting that OpenAI does not have the same track record of privacy and security commitments that Apple has.

The company has been criticised for its data collection practices, both the ways it has obtained information to train its models and the way it uses data collected from users. Apple’s challenge is to provide its users with the utility of ChatGPT without selling them out to OpenAI. At the WWDC keynote Apple obviously did not say that using ChatGPT on iPhone was less secure than sticking to its own AI capabilities, but there was a tacit admission along those lines: the iPhone will ask your permission every time before it shares information with ChatGPT.


In a news release Apple said that it obscures your IP address before sending the information, and that OpenAI “won’t store requests”. But it remains to be seen whether OpenAI stores any data collected from these Apple Intelligence requests. Even without IP addresses or the ability to identify individual users, the anonymised and aggregated data could be very useful for OpenAI in training its models.

Apple also mentioned that users will have the option to log in with their ChatGPT account, at which point they’ll no longer be anonymised by Apple and will be subject to OpenAI’s privacy policy. It’s not clear at this point what benefit there is to doing that, but it’s entirely possible OpenAI will announce some iPhone-specific features that only paying customers get, in order to encourage more sign-ups.

Ultimately I expect Apple will give users the ability to turn off ChatGPT if they like, and in the future it will likely offer multiple AI providers the same way it offers multiple search engines now. As for cloud processing in general, I’d like to believe you’ll be able to turn it off, but the company has hit the Private Cloud Compute talking point so hard it’s not a guarantee.



Serendib News
Serendib News is a renowned multicultural web portal with a 17-year commitment to providing free, diverse, and multilingual print newspapers, featuring over 1000 published stories that cater to multicultural communities.
