- This year it was all about the addition of artificial intelligence into every aspect of every OS
- But the latest features will only run on the latest hardware
The good news? Apple kicked off their annual WWDC developer event by rolling out exactly the keynote that everyone expected. The bad news? It’s time to get a new iPhone…
As a software-focused event, Apple’s kick-off keynote saw the topline features of the new iOS 18, iPadOS 18, watchOS 11, macOS 15 Sequoia, and even visionOS 2 shown for the first time and – needless to say – they all work together seamlessly for a uniquely Apple hardware+software experience.
There were neat features aplenty, but after showing off a cavalcade of “that’s clever” and “now why didn’t it do that before” moments, the pre-recorded presentation (shot inside, around, and above the Steve Jobs Theater at Apple Park) settled into the long-haul main groove of the deck – the introduction and implementation of a central AI technology across all their operating systems that they’re calling Apple Intelligence.
And while listing every nuance would take far too long here, there were a number of key “did they just say that?” moments.
It’s on-device… if you’re on the right device
First of all, there’s the fact that much of Apple Intelligence’s heavy lifting will be done on-device. That is to say, privately (of course), securely (likewise) and, above all, quickly and without the need for a connection. Such a move instantly puts them ahead of the competition: even if you don’t care about your privacy (and many don’t), the damn thing working and giving the right answer in record time certainly sounds good to us.
Yes, Apple’s overpowered hardware strategy seems all set to pay off. After spending years developing their own silicon and stuffing “AI cores” into hardware that doesn’t need them, running software that doesn’t touch them, Apple’s market-leading chips are – via their own new OS – about to get a thorough workout.
And that’s the MAJOR stumbling block of Apple’s plan. Apple Intelligence requires either an M series chip (powering the last three generations of Mac and iPad) or an Apple A17 Pro processor – a chip found only in Apple’s most recent, 2023 iPhone 15 Pro phones.
Yes, even if you bought 2023’s iPhone 15, which has been on sale for a mere nine months (or anything prior to that), you’re out of luck. No Apple Intelligence for you. Meaning that iOS 18 is little more than the ability to arrange icons with gaps in between and a change of colours.
It’s less of a problem for Mac and iPad, which get the same Apple Intelligence integration in their software. By now we’re used to Apple’s own M series chips (which replaced a long run of Intel processors that had held speed and battery life back and made true cross-device features impossible), and odds are that your iPad is already packing one too. Only last month Apple released their latest iPad Pro with the first airing of their M4 chip, creating a machine that – at the time – seemed preposterously overstuffed. Now that power is making sense.
How does Apple Intelligence work?
Simply put, it’s a 100% exclusive, Apple-derived large language model able to run on-device. And while it must have been tempting to kick the Siri brand to the curb, given her awful reputation for ruining your day, instead her familiar, soothing voice lives on, with Apple Intelligence simply providing a brain transplant in the background.
It’s to be hoped therefore that simple tasks such as “turn up the volume”, “call Rita”, and “remind me to get eggs” will finally elicit instant and accurate replies.
But Apple Intelligence can go much further than that. By studying your data on-device (that’s your messages, emails and even interactions with third-party apps) Apple Intelligence can build a picture of what’s going on in your life. So suddenly vague and complex questions such as “Where am I meeting Phil for lunch?” and “Add the song that Julie recommended to my playlist” become possible.
How many times have you tried to remember when and in which app you got that vital piece of information? Now Siri knows it all.
And if the on-device brain gets stumped or doesn’t know enough about your world, Apple Intelligence can go online to Apple’s own “powered by Apple Silicon” remote servers to take on the more complex questions and generative requests that the on-device part can’t handle. And, of course, because this is Apple, these servers are entirely private and don’t record or store any of your interactions.

And this extra power, when needed, gives Apple Intelligence some amazing, up-with-the-best generative features. For instance, a new Genmoji feature can create emoji-style images in Messages from what you type. Need to send a “shark eating the moon”? You got it.

Elsewhere, in Notes, a demo showed a rough sketch of an Indian temple being simply ringed and highlighted, then transformed into a perfect artistic rendering (in your choice of three styles). Most intriguingly, even if you’ve no ideas whatsoever, a blank space can be highlighted and the surrounding content used as inspiration to create another matching, perfectly in-context drawing. Content creators, pack your bags…

All this and ChatGPT too
And if that all wasn’t enough, Apple Intelligence can even make a call to ChatGPT and pull out the generative big guns. Yes, rather than compete with their own tech and always come second, Apple have cosied up to frenemy OpenAI and put ChatGPT deep under Siri and iOS’s skin.
If a task requires ChatGPT’s help (and at this point it’s not clear for which tasks, or how often, that might happen), iOS 18 will tell the user that it’s drafting in ChatGPT to get the job done, asking them to agree via a prompt and thereby acknowledge that their request is leaving Apple’s secure world.

In the presentation we saw a user generate a children’s bedtime story within the stock Apple Notes app, and complex images created from text. All in just a few seconds.
Apple clearly feels that this represents the best of all worlds, offering the security and ease that they’re famous for, while drafting in the extra power of a rival should your task require it. And the promise is that such open relations will continue, with other AI models (Google Gemini? Microsoft Copilot? xAI Grok?) being implemented inside iOS in the same way.
It all adds up to a clever, seamless, and distinctively Apple way to surf the AI wave. And just as the 1984 roll-out of the first Apple Macintosh pitched its easy-to-use graphical interface as “the computer for the rest of us”, so Apple Intelligence, built in at OS level, was described as “AI for the rest of us”.
However, the fact that only those of us with the very latest and top-end Pro phones will be able to enjoy all of Apple Intelligence’s new features will doubtless rankle with many. A bumper iPhone 16 launch later this year is starting to look like a certainty.