Apple Glasses leak hints at half-inch Sony OLED microdisplay for 2022 launch

Apple Glasses leak (Image credit: TechRadar)

The mysterious Apple Glasses have carried a rumored 2022 launch date for months, but Wednesday's Twitter leak offers some of the first concrete spec details we've seen about the displays Apple's AR wearable could use, and who might manufacture them.

Displays expert Ross Young tweeted that he has 'heard from multiple sources that Apple is pursuing AR/VR glasses using Sony microOLEDs. 0.5", 1280x960 resolution, 1H'22 intro.'

By 'Sony microOLEDs,' Young is referring to Sony's OLED microdisplay technology, which Sony says is built for AR/VR glasses among other applications. It claims these OLED microdisplays have a 100,000:1 contrast ratio, a response speed of 0.01ms or less, and a wide color gamut. 


Young says the half-inch (0.5in) display will have a pixel density of over 3,000 pixels per inch. Considering glasses lenses are typically about two inches wide, this suggests the OLED microdisplays will be embedded inside larger lenses, which could limit the field of view of the AR HUD. Perhaps the OLED portion of the lens will be able to project its visual data across the entire lens surface.
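That 3,000-plus figure lines up with the leaked resolution: a quick back-of-envelope check (assuming the 0.5in measurement refers to the panel's diagonal, as is the usual convention for display sizes) works out to roughly 3,200 PPI. A minimal sketch of the arithmetic in Python:

```python
import math

# Rumored panel specs from the leak
width_px, height_px = 1280, 960
diagonal_in = 0.5  # assumption: the 0.5in figure is the panel's diagonal

# Standard pixel-density formula: diagonal pixel count divided by diagonal size
diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)  # 1,600 pixels
ppi = diagonal_px / diagonal_in

print(f"{ppi:.0f} PPI")  # prints "3200 PPI", consistent with Young's "over 3000"
```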

While the leaker initially said that these displays would be used for AR/VR glasses, he then clarified that Apple planned to use them for its AR-only Glasses design. This tracks with rumors we've heard that AR Glasses would launch in 2022, followed by a VR/AR hybrid headset in 2023 or later.

Young previously predicted that the 2021 iPhone 13 lineup would have touch-integrated OLED screens with 120Hz ProMotion, which strongly suggests he has inside sources at Apple or at its manufacturing partners.

(Via AppleInsider)

Other Apple Glasses leak sightings

Recent Apple patent filings have given us even further insight into some of the features and techniques that Apple Glasses could use.

One Apple patent, 'Keyboard Operation With Head-mounted Device,' describes how the Glasses could project a 'virtual keyboard' onto a desk or 'touch-sensitive surface' and register your keystrokes. Found by AppleInsider, the patent also proposes making your hands invisible while you type so you can better see the keys, or projecting typed words or 'suggested text' in the air above your keyboard.

Another patent also found by AppleInsider, 'Monitoring a user of a head-wearable electronic device,' proposes using Apple Glasses to monitor your head movements and facial expressions, then send that data to your iPhone. Along with obvious movements like shaking your head, the Glasses could use their light sensors to detect 'chewing; smiling; frowning; grimacing; gasping; mouth opening; mouth closing; or humming.'

Finally, an Apple patent filed today and found by Patently Apple proposes that Glasses wearers could pick up two consumer products and hold them both within view. The Glasses would respond by searching for data on both products, then projecting a summarized comparison of the two in your field of view.

Michael Hicks

Michael Hicks began his freelance writing career with TechRadar in 2016, covering emerging tech like VR and self-driving cars. Nowadays, he works as a staff editor for Android Central, but still writes occasional TR reviews, how-tos and explainers on phones, tablets, smart home devices, and other tech.