Many of you out there are looking forward to the release of Apple’s next big thing, and it could be quite a doozie. Apple Glass is the long-rumored pair of AR-powered smart glasses that the tech giant is supposedly working on. Apple has filed enough patent applications for the device that we can safely conclude it is indeed in the works. For example, one application filed by the company shows how, thanks to AR, landmarks and iconic buildings would be labeled on the Apple Glass display.
The latest Apple Glass rumor includes the use of Sony optics for the smart glasses
Google Glass was first introduced to the world in a video that showed off the potential of the device as a replacement for the smartphone. But priced at $1,500 before smartphones crossed over the four-digit price tag line, Google’s device eventually turned into a production tool for the enterprise and health care. We expect that Apple Glass will try to recapture the excitement that Google generated back in April 2012 when it positioned Google Glass as a smartphone UI
floating in front of the user’s face at all times.
Sony rumored to deliver important components for Apple Glass
Last week, Ross Young, the founder of Display Search, posted a tweet
that read, “We have heard from multiple sources that Apple is pursuing AR/VR glasses using Sony microOLEDs. 0.5″, 1280×960 resolution, 1H’22 intro. Thoughts?” Young followed that tweet up with another one pointing out that the Sony component would be used for AR only. “It will use projection optics inside the glasses,” Young added. Sony does have experience producing small displays for headsets such as the PlayStation VR and its personal 3D viewer. The latter device uses two smaller OLED panels closer to what would be needed for Apple Glass; the former is equipped with a single 5.7-inch OLED display.
Earlier this year, Twitter tipster Jon Prosser tweeted out several leaks about the smart glasses, including one saying that the device will have a LiDAR sensor on the right temple. The sensor uses Time-of-Flight to measure how long it takes for infrared light to bounce off a subject and return to the sensor. That timing yields more accurate depth information, which in turn enables more precise AR capabilities.
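The Time-of-Flight principle described above comes down to one piece of arithmetic: half the round-trip time of a light pulse, multiplied by the speed of light, gives the distance to the subject. Here is a minimal Python sketch of that calculation (the function name and values are illustrative, not anything from Apple's hardware or software):

```python
# Sketch of the Time-of-Flight (ToF) depth calculation a LiDAR sensor relies on.
# An infrared pulse travels to the subject and back; the sensor measures the
# round-trip time, and half that time times the speed of light is the distance.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # speed of light in a vacuum, m/s

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to a subject given the round-trip time of a light pulse."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2

# A pulse returning after ~13.34 nanoseconds corresponds to roughly 2 meters.
print(f"{tof_distance_m(13.34e-9):.2f} m")
```

The nanosecond-scale numbers make clear why depth-sensing hardware needs extremely precise timing circuitry: a timing error of just one nanosecond shifts the measured distance by about 15 centimeters.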
Just as early Apple Watch
models relied on the owner’s iPhone to handle processing tasks, Apple Glass is expected to do the same when it is first released to the public, which Young says will take place during the first half of 2022. Eventually, Apple hopes to design the components needed to make Apple Glass self-sufficient in terms of processing power, although that might take some time.
There is speculation that outside of the LiDAR sensor, Apple will not add any cameras to Glass. That’s because Google Glass owners were criticized for using the camera to take photos without the subject even knowing they were being targeted for a snapshot. Several bars banned patrons wearing Google Glass for this very reason, and thus the nickname Glassholes was bestowed upon Google Glass wearers. Apple is sensitive to the potential lack of privacy that users might have to deal with. The rumored Starboard operating system will allow someone wearing Apple Glass to navigate the screen using gestures picked up by the LiDAR scanner or sensors built into the frames. The LiDAR scanner is also expected to allow Glass wearers to read QR codes, and Apple is reportedly using proprietary codes to test the product.
Prosser previously leaked a price of $499, not including prescription lenses. A version of Apple Glass with tinted lenses will not be available right away because Apple hasn’t been able to get the displays to work with tinted lenses.