Meta is rolling out a new software update for its smart glasses that aims to solve one of the most common real-world problems for wearable tech: hearing the person in front of you when the environment is loud. Announced Tuesday, the update adds a “conversation-focus” feature that amplifies a nearby voice through the glasses’ open-ear speakers, alongside a separate feature that connects what you see to what you can play on Spotify.
The changes will first arrive on Ray-Ban Meta and Oakley Meta HSTN smart glasses in the U.S. and Canada, with broader availability expected later as the company expands beyond its Early Access rollout.
A new “conversation-focus” mode for noisy places
The headline addition is a voice amplification tool designed for crowded settings such as restaurants, bars, commuter trains, and other high-noise environments. The feature uses the glasses' open-ear audio system to boost the speech of the person you're speaking with, aiming to make dialogue clearer without sealing the ear canal the way traditional earbuds do.
Meta says wearers will be able to fine-tune the amplification level in two ways:
- By swiping the right temple of the glasses to adjust intensity on the fly
- By adjusting settings in the companion app's device controls
The company first previewed the feature earlier this year at its developer and product event, positioning it as a practical quality-of-life improvement rather than a purely entertainment-driven use of wearable AI.
How it compares to Apple’s hearing-focused features
Meta isn’t alone in using consumer audio devices to assist with hearing in everyday conversations. Apple has offered a “Conversation Boost” capability on certain AirPods models, and its newer Pro lineup has increasingly leaned into hearing health functions, including a clinical-grade hearing aid feature in markets where it has regulatory clearance.
That comparison matters because it underscores a broader shift in the wearables market: smart audio is no longer just about music and calls. It’s becoming a layer of “situational enhancement,” where software attempts to improve how people perceive their surroundings—especially speech—without requiring dedicated medical devices.
Still, real-world performance will be the deciding factor. Amplifying speech in noisy spaces is technically challenging, and the user experience depends on how well the system isolates nearby voices and how natural the boosted audio sounds through open-ear speakers.
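Meta hasn’t published details of its processing pipeline, but the basic idea behind favoring a frontal talker can be illustrated with delay-and-sum beamforming, the simplest multi-microphone technique for the job. The Python sketch below is purely illustrative, not Meta’s implementation; the two-mic layout, the `broadside_beamform` function, and the gain parameter standing in for the temple swipe are all assumptions.

```python
import numpy as np

def broadside_beamform(left: np.ndarray, right: np.ndarray, gain: float = 2.0) -> np.ndarray:
    """Delay-and-sum beamforming for a hypothetical two-mic broadside array.

    A talker directly in front arrives at both mics at the same time,
    so averaging the channels keeps frontal speech at full level while
    uncorrelated off-axis noise partially cancels. `gain` stands in for
    the user-adjustable boost (e.g. a temple swipe).
    """
    focused = 0.5 * (left + right)      # coherent average favors the frontal talker
    boosted = gain * focused            # apply the user-selected amplification
    return np.clip(boosted, -1.0, 1.0)  # avoid clipping on open-ear speakers

# Toy demo: frontal "speech" (in phase on both mics) vs. uncorrelated noise
t = np.linspace(0, 1, 16_000, endpoint=False)
speech = 0.3 * np.sin(2 * np.pi * 220 * t)  # identical on both mics
noise_l = 0.3 * np.random.default_rng(0).standard_normal(t.size)
noise_r = 0.3 * np.random.default_rng(1).standard_normal(t.size)

out = broadside_beamform(speech + noise_l, speech + noise_r)
# Averaging leaves the in-phase speech unchanged while uncorrelated noise
# drops by about 3 dB, so the speech-to-noise ratio improves by roughly 3 dB.
```

Real systems layer far more on top of this, such as adaptive filtering and machine-learned speech enhancement, which is exactly why real-world results are hard to predict from a spec sheet.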
Spotify gets a visual trigger: play music based on what you see
Alongside the audio update, Meta is adding a feature that connects the glasses’ view to Spotify playback. In practice, the glasses can prompt Spotify to play a song that matches what’s in front of you.
Examples shared with the announcement include:
- Looking at an album cover and playing a track by that artist
- Seeing seasonal décor and cueing up holiday music
While this capability reads more like a novelty than a daily necessity, it signals where Meta wants to take its smart glasses platform: linking visual context to immediate actions inside apps. It’s a step toward “glance-and-do” computing, where recognition and intent are inferred quickly enough to feel effortless.
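To make the “glance-and-do” idea concrete, here is a hypothetical sketch of how a recognized visual label could be turned into playback using Spotify’s public Web API. The search and playback endpoints shown are real Spotify Web API calls, but the `play_for_scene` function, the `scene_label` input, and the wiring between vision and playback are this article’s illustration, not Meta’s actual integration.

```python
import requests

SPOTIFY_API = "https://api.spotify.com/v1"

def play_for_scene(scene_label: str, access_token: str) -> None:
    """Map a recognized visual label (e.g. 'holiday decorations' or an
    artist name read off an album cover) to a Spotify track and start
    playback on the user's active device.

    `scene_label` would come from an image-recognition step this sketch
    leaves out entirely.
    """
    headers = {"Authorization": f"Bearer {access_token}"}

    # 1. Search for a track matching the visual context.
    resp = requests.get(
        f"{SPOTIFY_API}/search",
        headers=headers,
        params={"q": scene_label, "type": "track", "limit": 1},
        timeout=10,
    )
    resp.raise_for_status()
    tracks = resp.json()["tracks"]["items"]
    if not tracks:
        return  # nothing matched; a real assistant would fall back gracefully

    # 2. Start playback on the active device (requires the
    #    user-modify-playback-state scope and a Premium account).
    requests.put(
        f"{SPOTIFY_API}/me/player/play",
        headers=headers,
        json={"uris": [tracks[0]["uri"]]},
        timeout=10,
    ).raise_for_status()

# Example: seasonal décor recognized in view -> cue holiday music
# play_for_scene("christmas classics", access_token="<OAuth token>")
```

The hard part in practice isn’t the API plumbing shown here; it’s inferring intent from the scene quickly and accurately enough to feel effortless.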
Where the Spotify feature is available
Meta says the Spotify visual playback feature will be offered in English across a wider list of markets than the conversation-focus tool. Supported countries include Australia, Austria, Belgium, Brazil, Canada, Denmark, Finland, France, Germany, India, Ireland, Italy, Mexico, Norway, Spain, Sweden, the United Arab Emirates, the U.K., and the U.S.
By contrast, the conversation-focus audio enhancement is initially limited to the U.S. and Canada—an important detail for buyers outside North America who may be weighing the glasses primarily for practical day-to-day benefits.
Rollout details: Update v21 and Early Access first
The new capabilities arrive as part of a software package identified as update v21. According to Meta, the first wave will go to members of the company’s Early Access Program, which requires users to join a waitlist and receive approval. After that, the update is expected to roll out more broadly.
This phased approach is typical for wearable features that rely on a combination of device firmware, companion apps, and cloud-connected services. It also allows the company to monitor performance, address edge cases, and refine controls—particularly important for a feature that directly affects how people hear their environment.
Why this matters for smart glasses in 2026
Smart glasses have often struggled to prove they’re more than a tech curiosity, but incremental, practical upgrades can change that equation. A tool that helps users hear conversations in loud places targets a universal pain point, and it nudges smart glasses closer to becoming an everyday accessory rather than an occasional gadget.
At the same time, the Spotify “play what you see” feature illustrates how Meta is trying to make its glasses feel like an extension of the apps people already use—connecting vision, audio, and software actions into a single wearable interface.
Whether these additions become must-have features will depend on how reliably they work outside demos and controlled conditions. For now, Meta is betting that better hearing and faster media actions are the kinds of improvements that can keep smart glasses on people’s faces long after the novelty wears off.