Meta just flipped the biggest switch yet for its smart glasses. Starting today, outside developers can finally build apps for the Ray-Ban Display, the company’s $799 face-worn computer with a tiny screen tucked into the right lens. The move turns what was a closed Meta-only gadget into something that could actually live up to the AR hype.
What Meta Just Unlocked for Developers
The announcement came straight from Meta CTO Andrew “Boz” Bosworth on May 14, 2026. For the first time, third parties can push real apps, overlays, and micro-experiences directly onto the in-lens display.
Meta is opening up development for its smart glasses with an integrated display, giving third-party developers a chance to build new software for the $799 wearable. The company said developers can create experiences that use the monocular display and Neural Band wrist controller, with support for connected iOS and Android apps or web apps.
There are two clear paths to ship something. The first is through the Meta Wearables Device Access Toolkit, a native mobile SDK for iOS and Android. It lets developers extend their existing app to the glasses’ display using Swift or Kotlin, adding UI components such as text, images, lists, buttons, and video playback. The second path is web apps, which are perfect for developers who want to build standalone experiences using standard HTML, CSS, and JavaScript. They can build everything from cooking guides to transit tools, test them in a browser, and deploy directly to the glasses.
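To make the web-app path concrete, here is a minimal sketch of the kind of "glanceable" logic a cooking guide might run. This is illustrative only: it does not use Meta's real SDK, and the gesture names ("pinch-next", "pinch-back") and the one-instruction-per-glance layout are assumptions, not documented behavior. The navigation logic itself is plain TypeScript you can test in any browser or Node.

```typescript
// Hypothetical state logic for a glanceable cooking guide.
// Gesture names and the Glance shape are invented for illustration.

interface Glance {
  title: string; // e.g. "Step 2 of 5"
  body: string;  // the single instruction shown on the lens
}

class StepGuide {
  private index = 0;
  constructor(private readonly steps: string[]) {}

  current(): Glance {
    return {
      title: `Step ${this.index + 1} of ${this.steps.length}`,
      body: this.steps[this.index],
    };
  }

  // Map a wrist gesture to a navigation action; out-of-range moves are clamped.
  onGesture(gesture: "pinch-next" | "pinch-back"): Glance {
    if (gesture === "pinch-next" && this.index < this.steps.length - 1) this.index++;
    if (gesture === "pinch-back" && this.index > 0) this.index--;
    return this.current();
  }
}

const guide = new StepGuide([
  "Preheat the oven to 220°C",
  "Season the vegetables",
  "Roast for 25 minutes",
]);
console.log(guide.onGesture("pinch-next").title); // "Step 2 of 3"
```

The point of the sketch is the constraint it encodes: on a lens-sized display, the app surfaces exactly one short line at a time and the Neural Band, not a touchscreen, drives the state machine.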
“The gap between idea and prototype has never been smaller. Add glasses and inputs like the Neural Band, and it feels like the early days of building in a way we haven’t seen in over a decade,” said Meta CTO Andrew Bosworth.
Why This Matters for Ray-Ban Display Owners
If you bought a pair last fall, your glasses are about to get a lot more useful. Until now, the display mostly showed Meta’s own stuff like AI replies, messages, and a viewfinder.
Letting outside developers build apps could help fix one of the biggest issues people had with the glasses after launch: they just didn’t do much. Meta has added features like a teleprompter and handwriting recognition over time, but a lot of early users still felt boxed in by the limited software. Cooking became a good example of that problem, since the only way to see a recipe on the display was by asking Meta AI instead of using a proper recipe app.
Now picture turn-by-turn directions, your grocery list, live game scores, or a club recommendation on the golf course, all floating in the corner of your vision. No phone pull-out needed.
Examples of what’s coming:
- Darkroom Buddy: An interactive guide for developing film that could serve as a “glanceable” reference.
- 18Birdies: The golf app is experimenting with the Meta Wearables Device Access Toolkit for real-time yardages and club recommendations, so golfers never have to pull their phone out of their pocket.
- Disney Imagineering: Disney’s Imagineering team is exploring using the toolkit to give guests a personal AI guide in Disney parks.
- Twitch & Streamlabs: Hands-free first-person livestreaming straight from the frames.
The Neural Band and Six Months of User Data
Meta also shared fresh detail on how people actually use the Display in real life. The built-in viewfinder is changing behavior, with users sharing more photos and videos because framing the shot finally feels natural.
Visual answers from Meta AI are also pulling higher engagement, meaning wearers are firing off more follow-up questions instead of giving up after one reply. The Neural Band, included with every pair, translates subtle finger muscle signals into commands for the glasses. That wrist input is the secret sauce. Tiny pinches and taps replace big arm waves, so you can scroll a list in a coffee shop without looking like you are conducting an orchestra.
The Ray-Ban Display’s in-lens screen is designed for short, user-initiated glances (checking messages, reading translations, previewing photos, querying Meta AI) and is positioned off to the side so it doesn’t obstruct the wearer’s view.
The Catch: Limits, Permissions, and Open Questions
This is still a developer preview, not a free-for-all app store. Publishing of integrations will be limited to select partners during the preview while Meta focuses on building, testing, and gathering feedback. The company expects to open publishing to the broader community later in 2026.
| Feature | Status at Launch |
|---|---|
| Camera access | Available |
| Microphone & speakers | Via Bluetooth profiles |
| Display rendering (HUD) | Now open via new SDK |
| “Hey Meta” custom commands | Not supported |
| Custom Neural Band gestures | Not yet |
| Video stream max | 720p at 30 FPS |
Video streaming tops out at 720p and 30 FPS, a ceiling imposed by the Bluetooth link to the phone. When Bluetooth bandwidth is constrained, resolution and frame rate are reduced automatically.
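A client dealing with that automatic degradation ends up with logic along these lines. The 720p/30 FPS ceiling comes from Meta's stated limits; everything else here, the bandwidth thresholds and the intermediate quality tiers, is invented for the example and would in practice be handled by the SDK itself.

```typescript
// Illustrative sketch: stepping stream quality down as Bluetooth bandwidth
// shrinks. Thresholds and tiers below are assumptions, not Meta's values.

interface StreamSettings {
  width: number;
  height: number;
  fps: number;
}

function pickStreamSettings(availableKbps: number): StreamSettings {
  if (availableKbps >= 4000) return { width: 1280, height: 720, fps: 30 }; // hard cap
  if (availableKbps >= 2000) return { width: 960, height: 540, fps: 30 };
  if (availableKbps >= 1000) return { width: 640, height: 360, fps: 24 };
  return { width: 640, height: 360, fps: 15 }; // congested link
}

console.log(pickStreamSettings(500).fps); // 15
```

The key design point is that no bandwidth estimate, however generous, ever yields more than 720p at 30 FPS; extra headroom is simply unused.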
There are real privacy questions too. Granting an app camera access on glasses is not the same as on a phone. It is a first-person view of someone’s whole day. Meta is leaning on a permission-on-demand model similar to iOS location services, but the trust test is just starting.
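A permission-on-demand model is easy to picture as a toy state machine: the app starts with no capabilities and must prompt at the moment of first use, the way iOS handles location. The capability names and the `prompt` callback below are assumptions for illustration; Meta's actual permission flow may differ.

```typescript
// Toy model of permission-on-demand. Capability names and the prompt
// callback are invented; only the grant/cache/revoke pattern is the point.

type Capability = "camera" | "microphone" | "display";

class PermissionStore {
  private granted = new Set<Capability>();

  // `prompt` stands in for the system dialog shown at the moment of use.
  request(cap: Capability, prompt: (cap: Capability) => boolean): boolean {
    if (this.granted.has(cap)) return true; // granted earlier, no re-prompt
    const ok = prompt(cap);                 // ask only when the app needs it
    if (ok) this.granted.add(cap);
    return ok;
  }

  revoke(cap: Capability): void {
    this.granted.delete(cap);               // the wearer can withdraw access
  }
}

const store = new PermissionStore();
// First camera use triggers the prompt; a second use must not.
const first = store.request("camera", () => true);
const second = store.request("camera", () => { throw new Error("re-prompted"); });
console.log(first, second); // true true
```

What makes this model fit glasses specifically is the `revoke` path: because the camera sees the wearer's whole day, access has to be as easy to withdraw as it was to grant.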
What This Means for the Smart Glasses Race
This is Meta’s clearest shot yet at building the “Android of the face.” Samsung is reportedly showing its Galaxy Glasses at Unpacked in July. Apple is rumored to be circling its own pair. Snap is iterating on Spectacles. The company that wins developers wins the platform.
Ray-Ban Meta glasses have already sold in the millions worldwide, proof that pairing an iconic frame with genuinely useful tech can reach a mainstream audience. Opening the display to outside builders takes that lead and tries to lock it in before rivals catch up.
For shoppers on the fence, the math just changed. A pair of $799 glasses that mostly answered questions felt like a luxury toy. A pair that runs your favorite recipe app, navigation tool, and fitness tracker on the lens looks more like the next phone replacement.
Meta’s Connect conference later this year is the next big checkpoint, and Mark Zuckerberg has already teased a follow-up frame. Whether developers actually show up in numbers, build apps people love, and survive Meta’s permission gates will decide if today’s announcement becomes a turning point or just another preview. For now, the door is open wider than it has ever been, and that alone feels like the start of something real. Share your thoughts in the comments and tell us which app you would build first if you had the SDK in your hands today.