In a recent ZDNET article, my friend and colleague David Gewirtz explained why he considers the upcoming iPhone 16, with its focus on iOS 18 and Apple Intelligence, an essential upgrade.
While I value David’s perspective, I beg to differ.
Also: Apple Intelligence arrives in iOS 18.1 developer beta. Here’s what’s new for iPhone
David argues that the incorporation of artificial intelligence (AI) in iOS 18 makes the iPhone 16 a necessary upgrade for him, emphasizing the potential of Apple Intelligence to revolutionize how we interact with our devices. While I agree with his view in the long term, I’m not convinced that the first version of Apple Intelligence will represent the big leap forward in usability that so many are anticipating.
Every year, my wife and I eagerly await the release of the new iPhone. Being part of Apple’s Upgrade Program, we return our devices, reset our loan with Citizens Bank, and acquire the latest model. Over the past few years, I have opted for the Pro Max, and my wife has chosen the base model. The expected annual improvements have been incremental but appreciated.
Despite the buzz around the iPhone 16’s new features and the integration of Apple Intelligence, however, several concerns dampen my enthusiasm for upgrading.
What Apple isn’t telling us about Apple Intelligence
Apple Intelligence represents a significant leap in on-device AI capabilities, bringing advanced machine learning and natural language processing directly to our phones. Unfortunately, this technology is still in its infancy. On-device LLMs and generative AI are essentially in an alpha or beta phase, and there’s a lot of uncertainty about how well they will perform on current Apple mobile hardware.
David views the integration of AI in iOS 18 as a significant leap forward, but let’s not kid ourselves. These on-device AI features are in their infancy, which means they might not deliver the seamless experience that Apple users have come to expect. When Apple launches Apple Intelligence to the public in the fall of 2024, it will still be a beta, not a finished product.
Apple Intelligence is not simply another routine iOS or MacOS feature upgrade. Devices will load a downsized version of Apple’s Foundation Models, a home-grown large language model (LLM) that will be several gigabytes in size and have as many as 3 billion parameters. (Compare that to the hundreds of billions of parameters used by models like GPT-3.5 and GPT-4, or to what Apple will run in its data centers for the “Private Cloud Compute” feature of Apple Intelligence.)
Also: Apple Intelligence will improve Siri in 2024, but don’t expect most updates until 2025
Apple has not yet fully detailed to developers how this will work on iOS, iPadOS, and MacOS, but the model will have to be loaded, at least partially, into memory. Current estimates suggest it could occupy between 750MB and 2GB of RAM when running, depending on how good Apple’s memory compression technology is, among other factors.
That’s a substantial amount of memory allocated to a core OS function that won’t always be used. As a result, parts of it will have to be dynamically loaded in and out of memory as needed, adding new system constraints for applications and potentially putting additional stress on the CPU.
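To put those numbers in perspective, here is a back-of-the-envelope sketch of how a 3-billion-parameter model's weight footprint varies with quantization. The 3B figure comes from the reporting above; the specific bit widths are my own illustrative assumptions, not anything Apple has disclosed:

```python
# Rough RAM estimate for an on-device LLM's weights alone.
# 3B parameters is the reported figure; the quantization levels
# are assumptions for illustration, not Apple's published specs.

PARAMS = 3_000_000_000  # ~3 billion parameters

def model_footprint_gb(params: int, bits_per_param: float) -> float:
    """Approximate in-memory size of the weights, in gigabytes."""
    return params * bits_per_param / 8 / 1024**3

for bits in (16, 8, 4, 2):
    print(f"{bits}-bit weights: ~{model_footprint_gb(PARAMS, bits):.2f} GB")
```

Note that only aggressive quantization (in the 2-to-4-bit range) lands in the 750MB-to-2GB window estimated above, which hints at the kind of compression Apple would need to make this fit on a phone.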
The current iPhone hardware doesn’t cut it
Earlier this month, I discussed how older and even current-generation iOS devices aren’t powerful enough to handle on-device generative AI tasks. The base iPhone 15, which has only 6GB of RAM (as do older iPhones, some with even less), may struggle to meet the demands of Apple Intelligence as it evolves and becomes more integrated into iOS, core Apple applications, and developer applications.
The iPhone 15 Pro, with 8GB of RAM, may be better suited for these tasks. It is the only iOS device developers can use to test Apple Intelligence (besides their Macs and iPad Pros) before the iPhone 16 ships, presumably in October. That said, many users may still experience suboptimal performance on an 8GB device when Apple Intelligence is fully implemented.
Also: iOS 18.1 update: Every iPhone model that will support Apple’s new AI features (for now)
Early adopters may find the AI features more useful for developers than everyday users, as the system may need fine-tuning to reach its full potential. I also expect Apple Intelligence to be a feature that users can simply turn off, freeing that memory for application use, much as owners of the base iPhone 15 and earlier models won’t have access to it at all when they upgrade to iOS 18.
The upcoming iPhone 16, despite possibly having more advanced hardware, may also struggle with the new AI capabilities due to design cycles that did not account for these features. It may take another product cycle or two before the hardware fully aligns with the new AI capabilities to be rolled out in iOS 18 and beyond. As a result, users may experience suboptimal performance and a less seamless user experience.
Why you shouldn’t buy the iPhone 16 for Apple Intelligence
For these reasons, I see the iPhone 16 (and potentially even the iPhone 17) as a transitional product in Apple’s journey to on-device AI.
In addition to other silicon optimizations, future iPhones will likely require more RAM to fully support these AI features, which could lead to increased costs. If the base iPhone 16 needs 8GB of RAM to run Apple Intelligence effectively, the starting price could be pushed to $899 or higher. The Pro models might require 12GB or even 16GB of RAM, further increasing the price. This would also mean a new A18 chip for the Pro models, while the base iPhone 16 might only get the current A17, though Apple could build an “A17X” paired with 10GB of RAM to give the phone more memory breathing room.
Also: How iOS 18 changes the way you charge your iPhone
Besides memory concerns, AI processing demands a lot of power and additional computing resources. Without significant advancements in battery and power management technology, users might have to charge their phones more often. This can lead to increased battery drain, reduced battery lifespan, and potential performance issues. The extra processing power needed to run on-device LLMs could strain the CPU, causing the device to heat up and affecting its overall performance and reliability.
How Apple Intelligence will likely evolve
Apple’s AI capabilities are expected to improve significantly in the coming years. By 2025, we may see more advanced and dependable integration of Apple Intelligence not only on mobile devices and Macs, but also on products like the Apple Watch, HomePod, Apple TV, and a consumer-oriented version of the Vision headset.
To extend Apple Intelligence to these less powerful devices, Apple might lean on cloud-based resources, much as it already offloads more advanced LLM processing to secure, Darwin-based servers in its own data centers for “Private Cloud Compute.” That would mean fully built-out data center capacity and partnerships with companies like OpenAI or Google.
Also: To rescue the Vision Pro, Apple must do these 3 things
Alternatively, Apple could consider a distributed or “mesh” AI processing system, where idle devices within a household or enterprise assist less powerful ones with LLM queries.
Apple could achieve this by equipping MacOS 15 Sequoia, iOS 18, and iPadOS 18 with Apple Intelligence and the on-device LLM as planned. Subsequent changes to iCloud, iOS, iPadOS, and MacOS could enable all devices to communicate their generative AI capabilities and idle processing state. This would allow them to act as proxies for each other’s Apple Intelligence requests.
Enterprises may also employ a mobile device management solution to let managed devices access the on-device LLMs running on business Macs. Additionally, iPhones or Macs could serve as proxies for Apple Watch or HomePod requests from mobile users. We may also see a more powerful Apple TV, with more onboard memory and processing, acting as an Apple Intelligence “hub” for every Apple device in a household.
Imagine your iPhone using the unused processing power of your Mac or iPad, all equipped with on-device LLMs, to tackle complex AI tasks. This would increase the accessibility of AI features across Apple’s product range.
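The mesh idea described above boils down to a routing decision: among the devices on your network, which idle, LLM-capable machine should field the request? Here is a minimal sketch of that logic; the device names, fields, and selection rule are all invented for illustration, not anything Apple has announced:

```python
# Hypothetical sketch of the "mesh" proxy idea: route an Apple
# Intelligence request to the most capable idle device on the
# local network, falling back to the cloud if none qualifies.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Device:
    name: str
    ram_gb: int
    has_on_device_llm: bool  # does it run the on-device model?
    idle: bool               # is it free to take work right now?

def pick_proxy(devices: list[Device]) -> Optional[Device]:
    """Choose the idle, LLM-capable device with the most RAM."""
    candidates = [d for d in devices if d.idle and d.has_on_device_llm]
    return max(candidates, key=lambda d: d.ram_gb, default=None)

household = [
    Device("iPhone 15", 6, False, True),  # lacks the on-device LLM
    Device("iPad Pro", 8, True, False),   # capable, but busy
    Device("Mac mini", 16, True, True),   # capable and idle
]
best = pick_proxy(household)
print(best.name if best else "no proxy; fall back to Private Cloud Compute")
```

In this toy household the Mac mini would take the job; with no eligible device, the request would fall through to the cloud, mirroring the tiered approach Apple already uses with Private Cloud Compute.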
I’m still optimistic
Despite the hype around Apple Intelligence, there are many other reasons to consider upgrading to the iPhone 16, like improvements in camera quality, display, and overall performance. The iPhone 16 will likely feature better sensors, enhanced computational photography, and superior video capabilities. The display might also see improvements in brightness, color accuracy, and refresh rate, making it a better device for media consumption and gaming.
Also: Apple may be cooking something big with its new Game Mode. Here are 3 things we know
If, however, you’re considering the iPhone 16 solely for its AI capabilities – which are still evolving and unlikely to deliver the expected performance touted in the WWDC 2024 keynote – you might want to manage your expectations.
Source: Robotics - zdnet.com