iPhone owners skeptical of AI’s reliability may be happy to learn that Apple’s technology will likely work on the device rather than through the cloud. At first, anyway.
Bloomberg reporter Mark Gurman made the revelation in the question-and-answer portion of this week’s Power On newsletter. “As the world awaits Apple’s big AI reveal [at WWDC] on June 10, it looks like the first wave of features will work entirely on the device,” he said (via MacRumors). “That means there is no cloud computing component in the company’s large language model, the software that powers new features.”
The iPhone may, however, benefit from some cloud-based AI features powered by technology from other companies: possibly Google’s Gemini or Baidu’s ERNIE Bot.
On the whole, this seems like good news. It’s easy to see why tech companies like to offload computing work from mobile devices to remote cloud server farms: doing so reduces battery drain and potential overheating while increasing the processing power available for a task. But for the user, it introduces an additional point of failure: the quality of their internet connection. (Strictly speaking, it introduces two, because Apple’s servers could also go down. But that’s less of a concern.)
Siri, for example, has long been known for needing a connection to complete even mundane tasks. The author of this article has often asked a HomePod to play a particular song, only for Siri to mishear and play something else, then refuse to turn it off despite repeated requests to “stop playing” because it couldn’t get a connection to interpret those words. This can be extremely frustrating.
Starting with iOS 15 in 2021, Apple wisely began moving some of Siri’s most basic and frequently used functions onto the iPhone itself, and these days it can set timers, create alarms, change system settings, and do much more while offline. If the same applies to the majority of iOS 18’s AI features, we’d consider that a win.
For all the new features coming to Apple devices this year, check out our regularly updated WWDC 2024 guide.