Apple’s Tim Cook, John Giannandrea and Craig Federighi on stage during WWDC in Cupertino, California, on 10 June 2024. Photograph: Justin Sullivan/Getty Images

How Apple plans to usher in ‘new privacy standards’ with its long-awaited AI


Company maintains its in-house AI is made with security in mind, but some professionals say ‘it remains to be seen’.

At its annual developers conference on Monday, Apple announced its long-awaited artificial intelligence system, Apple Intelligence, which will customize user experiences, automate tasks and – the CEO Tim Cook promised – will usher in a “new standard for privacy in AI”.

While Apple maintains its in-house AI is made with security in mind, its partnership with OpenAI has sparked plenty of criticism. OpenAI's ChatGPT has long been the subject of privacy concerns. Launched in November 2022, it collected user data without explicit consent to train its models, and only began to allow users to opt out of such data collection in April 2023.

Apple says the ChatGPT partnership will only be used with explicit consent for isolated tasks such as email composition and other writing tools. But security professionals will be watching closely to see how this, and other concerns, will play out.

“Apple is saying a lot of the right things,” said Cliff Steinhauer, director of information security and engagement at the National Cybersecurity Alliance. “But it remains to be seen how it’s implemented.”

A latecomer to the generative AI race, Apple has lagged behind peers like Google, Microsoft and Amazon, which have seen shares boosted by investor confidence in AI ventures. Apple, meanwhile, held off from integrating generative AI into its flagship consumer products until now.

The company would have you believe the wait was intentional – as a means to “apply this technology in a responsible way”, Cook said at Monday’s event. While other companies pushed out products quickly, Apple has spent recent years building most of the Apple Intelligence offerings with its own technology and proprietary foundational models, ensuring as little user data as possible leaves the Apple ecosystem.

Artificial intelligence, which relies on collecting large amounts of data to train large language models, represents a unique challenge to Apple’s longstanding privacy focus. Vocal critics like Elon Musk have argued that maintaining user privacy while integrating AI is impossible. Musk even said he would ban his employees from using Apple devices for work if the announced updates go through. But some experts disagree.

“With this announcement, Apple is paving the way for companies to balance data privacy and innovation,” said Gal Ringel, co-founder and CEO of data privacy software firm Mine. “The positive reception of this news, as opposed to other, recent AI product releases, shows that building up the value of privacy is a strategy that certainly pays off in today’s world.”

Many recent AI releases have ranged from dysfunctional and silly to downright dangerous – harkening back to Silicon Valley’s classic “move fast and break things” ethos. Apple appears to be taking an alternative approach, said Steinhauer.

“If you think about the concerns we have had about AI up to this point, it is that platforms are often releasing products and then fixing things as they pop up,” he said. “Apple is proactively addressing common concerns people have. It’s the difference between security by design and security after the fact, which will always be imperfect.”

At the core of Apple’s privacy assurances regarding AI is its new Private Cloud Compute technology. Apple aims to run most of the processing behind Apple Intelligence features on the device itself. But for functions that require more processing power than the device can handle, the company will outsource the work to the cloud while “protecting user data”, Apple executives said on Monday.

To accomplish this, Apple will export only the data required to fulfill each request, create additional security measures around the data at each endpoint, and not store data indefinitely. Apple will also publish all tools and software related to the private cloud for third-party verification, executives said.
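In rough terms, the policy executives described can be thought of as a routing rule: handle a request on the device when the local model can, and otherwise send the cloud only the fields the request strictly needs, with nothing retained afterwards. The sketch below is purely illustrative; the function names, fields and complexity threshold are assumptions for the example, not Apple's actual API.

```python
# Hypothetical sketch of the routing policy described for Private
# Cloud Compute. All names and the threshold are illustrative.

ON_DEVICE_LIMIT = 1_000  # assumed complexity budget for the local model


def minimal_payload(request: dict, required_fields: set) -> dict:
    """Export only the data required to fulfill the request."""
    return {k: v for k, v in request.items() if k in required_fields}


def route(request: dict) -> tuple[str, dict]:
    """Return (where the request runs, the data that leaves the device)."""
    if request["complexity"] <= ON_DEVICE_LIMIT:
        return "on-device", {}  # nothing leaves the device
    # Per Apple's description, cloud nodes would process this payload
    # and not retain it after returning a response.
    return "private-cloud", minimal_payload(request, {"prompt"})


# A small task stays local; a heavy one exports only the prompt,
# not the unrelated contact data bundled with the request.
local = route({"prompt": "summarise", "contacts": ["a@b.c"], "complexity": 10})
cloud = route({"prompt": "draft essay", "contacts": ["a@b.c"], "complexity": 5000})
```

The point of the sketch is the data-minimization step: even when processing leaves the device, the exported payload is a strict subset of the request rather than the whole thing.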

Private Cloud Compute is “a noteworthy leap in AI privacy and security”, said Krishna Vishnubhotla, vice-president of product strategy at mobile security platform Zimperium – adding that the independent inspection component is particularly notable.

“In addition to fostering user trust, these innovations promote higher security standards for mobile devices and apps,” he said.


Source theguardian.com

