The latest Apple Glass rumor points to Sony optics for the smart glasses
Earlier this year, Twitter tipster Jon Prosser tweeted several leaks about the smart glasses, including one claiming that the device will have a LiDAR sensor on the right temple. The sensor uses time-of-flight measurement: it times how long infrared light takes to bounce off a subject and return to the sensor, then converts that round trip into a distance. That more accurate depth data in turn enables more accurate AR capabilities.
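For a sense of the math behind time-of-flight, here is a minimal sketch of the calculation. The function and the sample timing value are illustrative assumptions; Apple has not published any details of the sensor's actual implementation.

```swift
import Foundation

// Time-of-flight principle: the sensor emits an infrared pulse and measures
// the round-trip time until the reflection returns. Distance is half the
// round trip multiplied by the speed of light.
// Hypothetical helper for illustration, not Apple's actual LiDAR API.
let speedOfLight = 299_792_458.0 // meters per second

func depthInMeters(roundTripSeconds: Double) -> Double {
    // Light travels to the subject and back, so halve the total path.
    return speedOfLight * roundTripSeconds / 2.0
}

// A reflection returning after ~13.3 nanoseconds implies a subject ~2 m away.
let depth = depthInMeters(roundTripSeconds: 13.3e-9)
print(String(format: "Estimated depth: %.2f m", depth)) // ≈ 1.99 m
```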
There is speculation that outside of the LiDAR sensor, Apple will not add any cameras to Glass. That's because Google Glass owners were criticized for using the camera to photograph people without their knowledge. Several bars banned patrons wearing Google Glass for that very reason, and wearers earned the nickname "Glassholes." Apple is reportedly sensitive to those privacy concerns.

The rumored Starboard operating system will let an Apple Glass wearer navigate the interface using gestures picked up by the LiDAR scanner or by sensors built into the frames. The LiDAR scanner is also expected to let wearers read QR codes; Apple is reportedly using proprietary codes to test the product.
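Apple has said nothing about how Glass would decode QR codes, but its existing Vision framework already handles this on iPhone and iPad today. The sketch below illustrates that current, shipping API only; nothing here is confirmed for Glass.

```swift
import Vision
import CoreVideo

// Illustration of QR detection via Apple's Vision framework as it exists
// on current devices. Any Glass-specific API is unknown and assumed here
// to look broadly similar.
func detectQRCodes(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectBarcodesRequest { request, error in
        guard error == nil,
              let results = request.results as? [VNBarcodeObservation] else { return }
        for observation in results where observation.symbology == .qr {
            // payloadStringValue holds the decoded contents of the code.
            print("QR payload:", observation.payloadStringValue ?? "<binary>")
        }
    }
    request.symbologies = [.qr] // restrict detection to QR codes only

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```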
Prosser previously leaked a price of $499, not including prescription lenses. A version of Apple Glass with tinted lenses will not be available right away because Apple hasn’t yet been able to get the displays working on tinted lenses.