At the unveiling of the new iPad Air, it was striking how little information Apple was prepared to reveal about the device's new A14 processor. The presumption was that the company was holding back some highlights for the iPhone 12 launch event, since the new phone will also feature this chip.
There are still no publicly available test results for the new iPad Air – at least not official ones. But leaked tests of the new chip hint that the remarkable 40% performance boost promised by Apple is no exaggeration. Apple’s new 5nm chip is significantly faster than its predecessor.
The two high-performance cores of the 6-core system run at just under 3GHz, whereas those in the A13 chip in the iPhone 11 run at 2.66GHz. Apple has also greatly improved graphics performance: according to the Geekbench database, the A14 scored 12,571 points in the Metal benchmark, surpassing the A12Z of the current iPad Pro (11,665).
But that's obviously not all: as a conversation between Stern magazine and Apple's Tim Millet revealed, the A14's new Neural Engine could be its most important innovation. It is claimed to be able to speed up some tasks by a factor of two – and in some cases, by a factor of ten.
On the circuit board, this specialised computing unit with 16 cores appears to take up as much space as the CPU cores, and it can perform 11 trillion arithmetic operations per second – whereas the equivalent unit in the iPhone X is capable of 'only' 600 billion. In addition, the A14 chip contains specialised blocks for machine learning. Core ML, the interface for these units, gives app developers access to frameworks such as Vision (image analysis), Natural Language (text analysis), Speech (audio-to-text) and Sound Analysis.
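To give a flavour of what this looks like in practice, here is a minimal sketch of calling one of those Core ML-backed frameworks from Swift – in this case the Vision framework's built-in image classifier, which runs on the Neural Engine where the hardware supports it. The image path is a placeholder, and this is an illustrative sketch rather than anything shown at the keynote:

```swift
import Vision

// Placeholder path – substitute a real image on the device.
let url = URL(fileURLWithPath: "/path/to/photo.jpg")

// Vision's built-in classifier needs no custom model;
// the request is dispatched to the Neural Engine where available.
let request = VNClassifyImageRequest()
let handler = VNImageRequestHandler(url: url)

do {
    try handler.perform([request])
    // Print the top five labels with their confidence scores.
    if let observations = request.results as? [VNClassificationObservation] {
        for observation in observations.prefix(5) {
            print("\(observation.identifier): \(observation.confidence)")
        }
    }
} catch {
    print("Vision request failed: \(error)")
}
```

The same pattern – build a request, hand it to a handler, read back typed observations – applies across the other frameworks, such as Sound Analysis and Natural Language.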
According to Millet, the raw performance made possible by 5nm technology is not in itself a revolution. What matters more, he says, is that Apple makes that performance easy for thousands of developers to use.
As was shown during the keynote, all of this enables completely new functions for an app like Pixelmator Pro. The tool can perform rapid, high-quality image scaling: in one demo, a heavily magnified area of an image was enlarged and smoothed in seconds – a task that would otherwise require desktop software like Photolemur.
Sound analysis is also much quicker with the new hardware, something that was demonstrated in the keynote using software from Algoriddim. Its djay software offers a new AI-based Neural Mix function and can bring two different songs to the same key and rhythm: if an iPad or iPhone has a Neural Engine, the tool can even filter out individual tracks in real time.
So we are starting to get a sense of the A14’s power and potential. But no doubt we will learn more when Tim Cook takes to the virtual stage on 13 October.
This article originally appeared on Macwelt. Translation by David Price. Main image courtesy of Twitter user Mr White.