Five ways Apple AR Glasses can avoid being awful

Apple Glass (Image credit: Future)

Welcome to the liminal space where we know the date, time, and place of an Apple event but have almost no confirmed information about its content. The most obvious response is to fill that gap with musings about the possibilities, especially those surrounding what we think will be our first glimpse of Apple’s AR Glasses.

To refresh your memory, many of us now believe Apple dropped a barely concealed hint, via its March event invite and an executive’s tweet, that its AR wearables are on the way.

Apple Glass, as the rumor mill likes to call them, should lean heavily on Apple’s already significant augmented reality skills. Owners of modern iPhones and iPads have seen how, looking through the device’s cameras, they can walk around a pair of virtual sneakers, play with a full-scale augmented LEGO creation, and virtually dissect a woebegone frog.

Thanks to powerful sensors, on-board processing, and ARKit, the effects are impressively realistic, and they get even better with the LiDAR scanners on the iPhone 13 Pro and iPad Pro, which can build a 3D mesh of the environment and let your AR creations interact more naturally with the real world.
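To give a sense of what that looks like in practice, here’s a minimal sketch of how an iOS app opts into ARKit’s LiDAR-backed scene mesh so virtual objects can hide behind and bump into real-world geometry. This is my own illustration of the existing iPhone and iPad pipeline, not anything Apple has shown for the glasses:

```swift
import ARKit
import RealityKit

// Minimal sketch: enable LiDAR scene reconstruction so virtual content
// can be occluded by, and collide with, real-world surfaces.
func makeARView() -> ARView {
    let arView = ARView(frame: .zero)

    let config = ARWorldTrackingConfiguration()
    // Scene reconstruction requires a LiDAR-equipped device
    // (iPhone 12/13 Pro, recent iPad Pro models).
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    // Tell RealityKit to use that mesh for occlusion and physics.
    arView.environment.sceneUnderstanding.options.formUnion([.occlusion, .physics])

    arView.session.run(config)
    return arView
}
```

If Apple Glass really does lean on a paired iPhone for processing, something very much like this configuration would presumably be doing the heavy lifting on the phone’s side.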

Apple’s ability to transfer all that AR hardware and software expertise is not in question. Its ability to squeeze it into eyeglasses is more of an open one.

It’s coming

We know that an Apple AR glasses reveal still lives in the realm of maybe, but since the assumption is that we’re seeing a first working concept and not the final product on March 8, there’s still time to offer some important product development advice.

Don’t look ridiculous

Apple’s legendary industrial design will be put to the test with Apple Glass. We’ve worn its unusual and initially rejected AirPods in our ears (now we love them in all forms) and the elegant Apple Watch on our wrists, but something on our face, under our eyebrows and over our eyes, is something else entirely.

Making Apple Glass, even if Apple puts much of the processing responsibility on your iPhone, means finding aesthetically pleasing places to put cameras, sensors (maybe LiDAR), transparent screens, and batteries.

That last one is where virtually every smart glasses manufacturer has struggled. You can’t really hide the bulk of a battery, though most give it their best shot by stuffing the cells into the stems.

For Apple, I suggest they find a way to spread the battery load throughout the frames and hide camera lenses and sensors behind one-way-mirrored glass or plastic.

Don’t require cables, even to charge

I don’t want to see a single port on my Apple Glass. Obviously, the glasses will connect to your iPhone via Bluetooth, but we still have to charge them. I don’t even want to see visible copper connectors on Apple Glass. Qi or MagSafe-based wireless charging is what I want.

Don’t give us a narrow field of vision

In the Apple Event invite tweet shared by Apple marketing chief Greg Joswiak, the video appears to demonstrate the new Apple Glass AR viewport.

It looks immersive, but that might be misleading. So many previous AR glasses have offered a tiny field of view. The early Microsoft HoloLens viewport looked like a 42-inch TV floating in front of you. Google Glass’s viewport looked like a tiny TV set that you had to glance up at to catch a glimpse of.

I want Apple Glass to feel as immersive as that short video made it look. To do so, Apple must figure out how to marry tiny-screen projection or display technology with a design that wraps around your face instead of sitting in front of it. I mean, Apple Glasses should have the look of a cool set of modern shades, not old-school black-framed eyeglasses (not even the nicer Ray-Ban Stories Facebook is hawking).

Don’t skimp on resolution

Our eyes will be close to whatever screen technology Apple plans to use. Assuming it’s OLED (which nicely supports transparency), I hope it’s HD or higher. Anything lower and we’ll see every pixel. I fully expect Apple, the King of Super Retina displays, to get this right. If they don’t, Apple Glass will be a disaster.
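To put rough numbers on that (my assumptions, not Apple’s specs): the eye resolves on the order of 60 pixels per degree, so the wider the viewport, the more pixels each eye needs before the display stops looking like a screen door.

```swift
// Back-of-envelope sketch using assumed figures, not Apple specs.
let pixelsPerDegree = 60.0   // rough limit of 20/20 visual acuity
let horizontalFOV = 50.0     // hypothetical viewport width, in degrees
let verticalFOV = 30.0       // hypothetical viewport height, in degrees

let requiredWidth = Int(pixelsPerDegree * horizontalFOV)   // 3,000 px
let requiredHeight = Int(pixelsPerDegree * verticalFOV)    // 1,800 px
print("Per-eye target: \(requiredWidth) x \(requiredHeight) pixels")
// That's well beyond 1080p per eye, which is why resolution is so
// easy to get wrong on a display sitting this close to your eyes.
```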

Don’t launch without prescription support

This one is personal. I cannot use or even properly test smart glasses without prescription lens support. Say what you will about Google Glass, but the lack of lenses on the base system made it possible for me to wear my prescription frames along with it (yes, I looked ridiculous).

Apple has an opportunity to get this right out of the box. Letting you order Apple Glasses with your prescription built into the lenses alongside the displays will make them more expensive (I’m guessing $999 to start), but I think it will be worth it.

Bonus don’t

Don’t skimp on memory and storage. It’s a bonus thought because I think it will only matter if Apple doesn’t pair Apple Glasses with your iPhone. If it does, all the CPU, memory, and storage load will be on the phone, not the glasses. If it doesn’t, Apple Glass must start with some sort of new Apple Silicon (could it fit an M1, maybe an M1 mini?), 8GB of RAM, and at least 128GB of storage.

There is the chance that we won’t see Apple’s AR glasses next week. If that happens, Apple can just keep this advice in its back pocket for when it is finally ready to introduce Apple Glass to the world.

Lance Ulanoff
Editor At Large

A 38-year industry veteran and award-winning journalist, Lance has covered technology since PCs were the size of suitcases and “on line” meant “waiting.” He’s a former Lifewire Editor-in-Chief, Mashable Editor-in-Chief, and, before that, Editor-in-Chief of PCMag.com and Senior Vice President of Content for Ziff Davis, Inc. He also wrote a popular weekly tech column for Medium called The Upgrade.

Lance Ulanoff makes frequent appearances on national, international, and local news programs including Live with Kelly and Mark, the Today Show, Good Morning America, CNBC, CNN, and the BBC.