I have a confession: Although I frequently review new Apple products, I don’t always buy them. Like many of you, I can’t afford to upgrade every piece of Apple hardware every time the company overhauls one of its products. So I have to carefully gauge when my stuff has gotten too old and needs to be replaced with something shiny and new.
Of course, Apple would like us all to buy new products all the time. But the company has to earn those sales. I might buy a new iPhone because of an improved camera, or a new MacBook Air because of a new design and a faster processor. Or I might skip the latest Apple Watch because the new features just don’t matter to me.
As the heat from the iPhone’s huge growth spurt dies down and iPad and Mac sales fall from their pandemic-induced highs, Apple is looking for reasons to sell new hardware. And it may have found a big one in a somewhat unexpected place: AI.
AI models eat RAM
Artificial intelligence algorithms are, of course, software. Theoretically, all current Apple hardware should be capable of running some AI features; Apple has been building neural engines into its chips for years, after all. And yet the rumored addition of major AI features to Apple’s platforms starting this fall could fuel a new wave of upgrades.
Indeed, when we talk about AI these days, we’re largely talking about large language models (LLMs), things like OpenAI’s ChatGPT and Google’s Gemini. Apple is reportedly building its own LLM, with the intent of running it natively on Apple devices rather than offloading it to the cloud. That could significantly improve both speed and privacy.
But here’s the thing: LLMs need a lot of memory. Google has limited Gemini Nano, a model likely quite similar to what Apple is planning for the iPhone, to only its largest Pixel phones, apparently because of memory constraints.
The most RAM ever in an iPhone is the 8GB in the iPhone 15 Pro and Pro Max. Although iOS has generally been better at managing memory than Android, that’s still a relatively small amount of RAM, and it appears to be the bare minimum for running the kind of on-device LLM that Apple and Google are working on.
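To get a rough sense of why 8GB looks like the floor, here’s a quick back-of-envelope sketch. The parameter count and quantization level below are purely illustrative assumptions, not anything Apple or Google has confirmed; the point is simply that the model weights alone can eat a meaningful chunk of a phone’s RAM before you account for the working memory needed while generating text, or for everything else running on the device.

```python
# Back-of-envelope estimate of how much RAM an on-device LLM's weights need.
# The model size and quantization below are illustrative assumptions only.

def weight_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate memory (in GB) required just to hold the model weights."""
    total_bytes = params_billions * 1e9 * bits_per_weight / 8
    return total_bytes / 1e9

# A hypothetical 3-billion-parameter model:
print(f"4-bit weights:  {weight_memory_gb(3, 4):.1f} GB")   # ~1.5 GB
print(f"16-bit weights: {weight_memory_gb(3, 16):.1f} GB")  # ~6.0 GB
```

On an 8GB phone that also has to keep the operating system, the camera pipeline, and your foreground apps in memory, even the smaller of those numbers is a significant bite.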
Since Apple is reportedly unveiling its AI efforts at WWDC in June, it can’t really show off iPhone features that don’t work on any current models. But it’s not unreasonable to assume that many of iOS’s AI features might be limited to the iPhone 15 Pro models, since those are the only ones with 8GB of memory. (A new line of iPhones in the fall would presumably come with enough memory.)
And just like that, Apple’s AI announcements could provide a whole set of features to motivate potential buyers. Want to use Apple’s most impressive new AI features? Unless you just bought the highest-end iPhone, you’ll need to upgrade.
One step behind
On the Mac, things will probably be a little easier. Macs are beefier than iPhones, and it’s likely that most Apple silicon Macs will do fine with an Apple-built LLM, though even there M1 Macs may lag a bit behind the M2 and M3 models.
Still, I’m starting to think that the most compelling reason for someone who owns an Apple silicon Mac to upgrade will be slow processing of AI models, which can require a lot of memory and a lot of GPU cores. I’m a big fan of M1 Macs, including the low-cost M1 MacBook Air, but Apple’s next-generation AI features may make the M1 feel old.
Then there’s the Apple Watch. Its hardware has only just been upgraded to support on-device Siri for the first time, which suggests it may be a while before it has enough oomph to run an on-device LLM. But the more I think about it, the more I realize I would upgrade my Apple Watch in a heartbeat if it got me a better, more responsive voice assistant.
Apple has sold a ton of M1 Macs, but they may not be powerful enough (or at least may not have enough memory) to handle on-device AI processing.
Of course, it’s still on Apple to deliver AI features people actually want. One of Apple’s most consistent traits over the years has been its ability to take cutting-edge technology and turn it into features that users genuinely like. Shipping an LLM and other AI capabilities won’t be a panacea on its own: they need to be built into features that people will actually want to use.
But if Apple manages to integrate AI into its operating systems in a way that makes them more compelling, and if by happy coincidence that requires faster processors and more memory, it will motivate a wave of hardware upgrades. And that’s a good thing for Apple, because while operating system updates are free, new iPhones absolutely aren’t.
I’m not excited about replacing my Apple hardware, but I’d rather do it because I’m motivated by an awesome AI-based feature than because I’m tired of the color of my laptop or the shape of my iPhone.