At its WWDC keynote, livestreamed on June 7, Apple announced a new feature coming to the iPhone, iPad, and Mac this year.
Visual Look Up will recognize a variety of objects captured in your photos and let you look up information about them by tapping the small interactive pop-up that appears on top. According to Apple, it will easily identify things like the breed of a dog, the genus of a flower, the name and geographical location of a particular landmark, and so on.
Apple's post-WWDC news release covers Visual Look Up only sparingly:
With Visual Look Up, users can learn more about popular art and landmarks around the world, plants and flowers found in nature, breeds of pets, and even find books.
The livestream showed about as much: a single row of screenshots demonstrating exactly what's listed.
How will it compare to Google Lens?
With all of that taken into account, Apple's Visual Look Up is hardly at as impressive or versatile a stage of development as Google Lens, which launched in 2017 and has improved enormously since then.
Google Lens identifies dog breeds, plants, and landmarks too, but that's only the beginning of its capabilities. Google has developed Lens to the point where it can recognize three-dimensional objects through the camera and look up nearly any product by finding similar photos on the web, telling you where you can buy it and for how much.
With Google Lens you can also translate text in real time, with the translation superimposed on the screen in augmented reality as you scan your surroundings. Apple's Live Text will support translation as well, but it freezes the image and overlays the translated text there, rather than using Lens's live AR approach.
We expect Apple will do something special with Visual Look Up in the future, even if its feature set is limited for now.