In the fall of 2023, the Ray-Ban Meta Smart Glasses, once known as Ray-Ban Stories, got a big upgrade: a faster processor, better cameras, improved audio, and the ability to livestream to Facebook and Instagram. Oh, and Meta AI is now on board. The recent Version 2 update stepped up the game with better image quality, global volume control, and beefed-up security features. And more updates are on the way.

Meta's CTO, Andrew Bosworth, shared in a Threads post (via Engadget) that a new feature is rolling out with the latest beta: a tool that identifies landmarks and serves up additional information about them, essentially acting as a virtual tour guide for travelers.
Bosworth flaunted a couple of sample pics, breaking down why the Golden Gate Bridge rocks an orange hue (apparently, it's easier to spot in fog), dishing out some trivia about the iconic "painted ladies" houses, and shedding light on Coit Tower in San Francisco. Below each snapshot, a description popped up, adding extra context to the visuals.

Meanwhile, Mark Zuckerberg took to Instagram to flex the glasses' new feature with a bunch of videos shot in Montana. This round, the glasses switched gears, using audio to give a verbal rundown of Big Sky Mountain and the backstory behind the Roosevelt Arch. Zuckerberg also threw in a quirky request, asking Meta AI to explain how snow forms in a primitive, caveman-style manner.
Meta gave a sneak peek of this feature at its Connect event last year, rolling out "multimodal" tricks that let the glasses tackle questions based on your surroundings. It all works thanks to Meta's smart glasses tapping into real-time info, with a boost from Bing Search.
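Meta hasn't published developer documentation for any of this, but as a rough sketch of what that search-grounding step might look like: once the AI has pinned down a landmark, a web lookup can supply fresh facts to fold into the spoken answer. The endpoint and response fields below are invented for illustration; they are not a real Bing or Meta API.

```python
import requests

# Hypothetical search endpoint and response shape. Meta has not published
# an API for this feature, so everything here is illustrative only.
SEARCH_URL = "https://api.example.com/search"

def ground_landmark_answer(landmark: str) -> str:
    """Fetch up-to-date facts about an identified landmark and fold them
    into a short spoken-style answer, mimicking the Bing-backed lookup
    Meta describes."""
    resp = requests.get(SEARCH_URL, params={"q": landmark}, timeout=10)
    resp.raise_for_status()
    snippets = resp.json().get("snippets", [])

    if not snippets:
        return f"This looks like {landmark}."
    # Keep the reply short enough to read aloud through the glasses' speakers.
    return f"This is {landmark}. " + " ".join(snippets[:2])

print(ground_landmark_answer("Golden Gate Bridge"))
```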
The feature is essentially Meta's own take on Google Lens, letting users "show" the AI whatever they see through the glasses and ask questions about it. For the moment, it's exclusive to those in Meta's early access program, but it will roll out to more users in time.
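For the curious, here's a rough mental model of that Lens-style "show and ask" flow: a client encodes a frame from the glasses' camera and sends it alongside the question to a vision-language endpoint. Again, the URL and JSON fields are placeholders, not Meta's actual interface.

```python
import base64
import requests

# Placeholder endpoint and payload shape; Meta's real interface is private.
VLM_URL = "https://api.example.com/multimodal/ask"

def show_and_ask(image_path: str, question: str) -> str:
    """Send a glasses-captured photo plus a question to a hypothetical
    vision-language model and return its text answer."""
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("ascii")

    resp = requests.post(
        VLM_URL,
        json={"image": image_b64, "prompt": question},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["answer"]

# e.g. show_and_ask("coit_tower.jpg", "What is this tower, and who built it?")
```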