Can Apple beat Meta's smart glasses by adding cameras and AI to AirPods Pro?
Seeing out your ears
After Meta made AI wearables a centerpiece of its announcements at Meta Connect 2024, the question is how its rivals will respond. In particular, plenty of rumors are flying around about what Apple has planned in response to the upgraded Ray-Ban Meta Smart Glasses and the upcoming Orion smart glasses, which combine augmented reality with AI.
One major long-running rumor is that Apple plans to incorporate the hardware and AI software for a wearable not into glasses, but into its next-generation AirPods Pro. That might include cameras and AI features to match what you see in Meta's smart glasses. This potential competition sets the stage for a battle of wearables, with both companies seeking to redefine how users interact with the digital and physical worlds.
Meta boasted that the upgraded Ray-Ban Meta smart glasses can take photos, live stream video, and otherwise provide hands-free access to the world through voice-controlled audio and visual sensors. With the Meta AI assistant integrated into the device, the smart glasses can handle requests in conversational form. However, they are only a shadow of what the Orion smart glasses previewed at the event could do. Orion will employ augmented reality to meld digital content with the physical world through a holographic display.
Smart Futures
Apple's approach with the speculative AirPods Pro is more about leveraging AI for contextual awareness, using infrared cameras to interpret the space around you. They wouldn't take photos or video from the perspective of your ears. Instead, they would use visual input to subtly improve navigation, fitness tracking, and responsiveness to gesture controls. They would also likely complement Apple's Vision Pro headset, delivering even more accurate spatial audio by tracking head movements and adjusting sound to the user's surroundings.
The clearest way to think about the difference between Meta's smart glasses and Apple's AI-equipped AirPods is how each connects AI to hardware. They might divide prospective users based on what they want from their wearables. Meta's Ray-Ban glasses are geared toward capturing and sharing experiences visually, while Apple's AirPods seem to be more about enhancing the AI assistant with more passive input than is currently available. What they share is an interest in immersive experiences and making AI an ever-present aspect of wearable technology.
While Orion may link the physical and digital worlds through augmented reality, Apple's AirPods offer a lighter touch, enhancing environmental awareness and boosting the AI assistant's ability to help you. However, integration with the Vision Pro headset gives more weight to the AirPods and their capacity to deliver AI experiences. That's especially true if they can sync up with Apple's extensive ecosystem of devices.
Eric Hal Schwartz is a freelance writer for TechRadar with more than 15 years of experience covering the intersection of the world and technology. For the last five years, he served as head writer for Voicebot.ai and was on the leading edge of reporting on generative AI and large language models. He's since become an expert on the products of generative AI models, such as OpenAI’s ChatGPT, Anthropic’s Claude, Google Gemini, and every other synthetic media tool. His experience runs the gamut of media, including print, digital, broadcast, and live events. Now, he's continuing to tell the stories people want and need to hear about the rapidly evolving AI space and its impact on their lives. Eric is based in New York City.