The mysterious Apple Glasses have had a rumored 2022 launch date for months, but Wednesday's Twitter leak offers some of the first concrete spec details we've seen regarding how Apple's wearable AR displays will be manufactured.
Displays expert Ross Young tweeted that he has 'heard from multiple sources that Apple is pursuing AR/VR glasses using Sony microOLEDs. 0.5", 1280x960 resolution, 1H'22 intro.'
By 'Sony microOLEDs,' Young is referring to Sony's OLED microdisplay technology, which Sony says is built for AR/VR glasses among other applications. It claims these OLED microdisplays have a 100,000:1 contrast ratio, a response speed of 0.01ms or less, and a wide color gamut.
Young followed up on October 22, 2020: 'I should clarify that this is AR only. It will use projection optics inside the glasses.'
Young says the 0.5-inch display will have a pixel density of over 3,000 pixels per inch. Considering glasses lenses are typically about two inches wide, this suggests the OLED microdisplays will be embedded inside larger lenses, which could limit the field of view of the AR HUD. Perhaps the OLED portion of the lens will be able to project its visual data across the entire lens surface.
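The 3,000+ figure checks out from the leaked spec alone. A quick sanity check, assuming (as is conventional for displays) that the 0.5" measurement refers to the panel diagonal:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density (pixels per inch) from resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # diagonal length in pixels
    return diagonal_px / diagonal_in

# Leaked Sony microOLED spec: 1280x960 at 0.5"
print(round(ppi(1280, 960, 0.5)))  # → 3200
```

At 1280x960 the diagonal is exactly 1,600 pixels, so a 0.5" diagonal works out to 3,200 PPI, comfortably above Young's "over 3000" claim.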
While the leaker initially said that these displays would be used for AR/VR glasses, he then clarified that Apple planned to use them for its AR-only Glasses design. This tracks with rumors we've heard that AR Glasses would launch in 2022, followed by a VR/AR hybrid headset in 2023 or later.
Young previously predicted the 2021 iPhone 13 lineup would have integrated touch OLED screens with 120Hz ProMotion, which strongly suggests that he has inside sources at Apple or with its manufacturing partners.
(Via AppleInsider)
Sighting other Apple Glasses leaks
Recent Apple patent filings have given us further insight into some of the features and techniques that Apple Glasses could use.
One Apple patent, 'Keyboard Operation With Head-mounted Device,' describes how AR Glasses could be used to project a 'virtual keyboard' onto a desk or 'touch-sensitive surface' and register your keystrokes using the Glasses. Found by AppleInsider, this patent also proposed making your hands invisible while you type to better see the keys, or projecting typed words or 'suggested text' in the air above your keyboard.
Another patent, also found by AppleInsider, 'Monitoring a user of a head-wearable electronic device,' proposes using Apple Glasses to monitor your head movements and facial expressions, then send the data to your iPhone. Along with obvious movements like shaking your head, Apple Glasses could use its light sensors to monitor 'chewing; smiling; frowning; grimacing; gasping; mouth opening; mouth closing; or humming.'
Finally, an Apple patent filed today and found by Patently Apple proposes that Glasses wearers can pick up two consumer products and hold them both within view. The Glasses will respond by searching for data on both products, then projecting a summarized report of how the two products compare to one another in your field of view.