The Ray-Ban Meta Smart Glasses are getting a welcome camera and audio update

Ray-Ban Meta Smart Glasses
(Image credit: Meta)

Meta is rolling out software update Version 2.0 to its Ray-Ban Meta Smart Glasses, and it's set to deliver some very welcome image and audio quality-of-life improvements.

Firstly, the smart glasses' cameras will get some low-light performance upgrades, with reduced noise and improved auto exposure leading to sharper images – or so Meta promises; we haven't been able to test this yet. You should also find that videos you capture while moving – Meta calls these 'on-the-go captures' – have more dynamic range and are sharper.

As for the audio, Meta is adding a master volume control that lets you adjust the volume of sounds from your glasses. Once update Version 2.0 has hit your Ray-Ban Meta Smart Glasses you'll be able to control the volume of music, voice commands, and other sounds by swiping up and down on the touchpad by your temple – previously you could only control music volume on the glasses.

Lastly, Meta’s official changelog says the update will deliver “Security and stability improvements” – though it hasn’t explained in detail what this means for the smart specs.

A blue pair of the Ray-Ban Meta Smart Glasses Collection on a wooden table in front of their charging case

Get ready for better snaps from your smart specs (Image credit: Future)

The update takes a few minutes to install. If you don't have auto-updates turned on, open the Meta View app on your phone, tap the picture of your glasses in the top-left corner, tap 'Glass & privacy' in the menu, select 'Your glasses', then 'Updates', and finally 'Install update' if your device finds the Version 2.0 update.

If it doesn’t find it, don’t worry – it usually takes time for updates to roll out to everyone. Make sure to check back later, or turn on auto-updates so that you don't have to keep checking if the update is ready yet.

Also be sure to turn on your Ray-Ban Meta Smart Glasses and connect them to your phone via Bluetooth, otherwise they won't be able to install the Version 2.0 software.

Looking and Asking for more

Unfortunately, while the Version 2.0 update includes some camera and audio upgrades the specs needed, it doesn't include the long-anticipated rollout of the Ray-Ban Smart Glasses' Look and Ask AI recognition tools to people outside the US-exclusive beta.

This feature was, for many, the standout tool at Meta's launch presentation at Meta Connect 2023 back in September last year. Using the glasses' camera, Meta AI can scan your environment to respond to your questions, like a combination of Google Lens and ChatGPT.

Orange Ray-Ban Meta Smart Glasses in front of a wall of colorful lenses including green, blue, yellow and pink

A flurry of improvements, but not the one we most want yet (Image credit: Meta)

You start with “Hey Meta, look and…” then follow up with something like "What can I make with these ingredients?” or “How much water does this plant need?” or "Translate this into English.” The AI can then search its data banks and the internet (via Bing) to source an answer to your question, using its camera to help it identify food, flowers, or French as required.

The software updates Meta has been rolling out have helped to improve the Ray-Ban Smart Glasses, but until these promised AI tools are available to everyone the glasses won't feel complete. We'll just have to hope they arrive in Version 3.0, and that the next update isn't too many months away.


Hamish Hector
Senior Staff Writer, News

Hamish is a Senior Staff Writer for TechRadar, and you'll see his name appearing on articles across nearly every topic on the site, from smart home deals to speaker reviews to graphics card news and everything in between. He uses his broad range of knowledge to help explain the latest gadgets and whether they're a must-buy or a fad fueled by hype. His specialty, though, is writing about everything going on in the world of virtual reality and augmented reality.