Vendors

Meta Ray-Ban (Meta DAT)

Complete Meta Ray-Ban developer guide — hardware tiers (Ray-Ban Meta, Ray-Ban Meta Gen 2, Ray-Ban Meta Display), the Meta Device Access Toolkit (DAT) public capability matrix, distribution and development state, what you can build and ship today vs what's still gated behind Meta partnerships, and how Extentos abstracts the toolkit so the same code runs against the simulator and real glasses.

Meta Ray-Ban is the GA target for Extentos. The hardware ships in three tiers as of 2026 — original Ray-Ban Meta, Ray-Ban Meta Gen 2, and Ray-Ban Meta Display (the variant with a heads-up display and a neural-band wrist controller). Third-party apps reach the glasses through Meta's Device Access Toolkit (DAT) — a public Android and iOS SDK that exposes camera, microphone, audio, and sensor capabilities via Bluetooth, plus a Mock Device Kit for testing without hardware. This page is the consolidated capability matrix, the distribution-and-development state, and the practical answer to "what can I actually build and ship for Meta Ray-Ban as a third-party developer in 2026?"

TL;DR

| What you can build today | Status |
| --- | --- |
| Photo / video / camera-stream apps via DAT | ✅ GA on public toolkit |
| Voice-triggered apps (custom phrases via phone STT) | ✅ GA |
| TTS-driven audio apps (phone TTS played over BT) | ✅ GA |
| Sensor / IMU data consumers | ✅ GA |
| Apps that use the Display (heads-up rendering) | ❌ Restricted — Meta-curated partners only as of 2026-04 |
| Apps that hook the "Hey Meta" wake word | ❌ Not exposed to third parties |
| Apps that listen for custom gestures (tap, swipe) | ❌ Not in public preview |
| Distribution via Meta App Store | ⚠️ Preview-limited as of 2026-04 — most apps ship private/sideload |
| Distribution via APK sideload / TestFlight / private | ✅ Works today, no Meta review needed |

The short version: a third-party developer can build and privately ship rich photo/video/voice/audio apps for Ray-Ban Meta today using the public DAT toolkit. Display rendering, Hey-Meta-style wake words, and Meta App Store distribution are gated behind partnership status that Meta hands out selectively.

Compatible models

Extentos runs on every Meta smart-glasses model on the market. Compatibility is determined by Meta's Device Access Toolkit (DAT) — and DAT exposes the same camera, microphone, speaker, and sensor surface uniformly across every model. Frame styles within a model line don't change what Extentos can do; a Wayfarer-style Ray-Ban Meta and a Headliner-style Ray-Ban Meta have identical capabilities as far as Extentos is concerned.

| Model line | Frame styles / variants | Camera / Voice / Audio / Sensors | Display rendering | Neural-band gestures |
| --- | --- | --- | --- | --- |
| Ray-Ban Meta (Gen 1, 2023) | Wayfarer, Headliner, Skyler, and other Ray-Ban frame styles | ✅ Fully supported | n/a (no display) | n/a (no neural band) |
| Ray-Ban Meta Gen 2 (2024) | Refreshed frame line, improved low-light camera | ✅ Fully supported | n/a (no display) | n/a (no neural band) |
| Ray-Ban Meta Display (2025) | Single variant with monocular HUD + Meta neural band | ✅ Fully supported | ❌ Meta-curated partners only | ❌ Not in DAT public toolkit |
| Oakley Meta HSTN (2025) | Sport-focused variant of the same hardware platform | ✅ Fully supported | n/a (no display) | n/a (no neural band) |

What this means for the Display model

This is the most-asked compatibility question. Extentos runs on the Ray-Ban Meta Display the same as it runs on any other Ray-Ban Meta — your camera capture, voice triggers, audio playback, sensor reads, and hardware events all work identically. The wearer can use a Display-variant Ray-Ban Meta with your Extentos-built app and get the full app experience.

What you don't get on a Display variant is access to the heads-up display surface or the neural-band gestures. Those are Meta-controlled partner surfaces — Meta exposes them only to curated integrations like Google Maps and the small number of other featured apps, and they aren't available through the public DAT toolkit. If a wearer has a Display variant and your app uses voice triggers and camera capture, they get exactly the same experience as someone wearing a Gen 1 or Gen 2 — there's no degraded path.

If you want display rendering or neural-band gesture access, the path today is a Meta partnership, not the public DAT toolkit. See distribution state for the partnership picture.

Compatibility approach

Extentos doesn't maintain a model-by-model verification matrix because the underlying SDK (Meta DAT) is the same across every Ray-Ban Meta and Oakley Meta variant. A capability that works on one model works on all of them, modulo hardware tier. New Meta-branded smart-glasses models will work without code changes if Meta keeps using DAT — which has been the consistent pattern through 2024 and 2025.

As of 2026-04-30 — model lineup, frame styles, hardware specs, and toolkit boundaries are pre-publication notes. Verify against Meta's Developer Center, the DAT GitHub repository, and the current Meta smart-glasses product page before quoting publicly. Newer models released between this writing and Meta's next major launch should slot into the same compatibility story automatically.

What's in the Device Access Toolkit (DAT)

Meta DAT is the official public SDK for talking to Ray-Ban Meta from third-party Android and iOS apps. As of 2026-04 the toolkit exposes:

Capabilities (GA)

| Capability | Android API | iOS API | Notes |
| --- | --- | --- | --- |
| Photo capture | StreamSession.capturePhoto() | streamSession.capturePhoto() + photoDataPublisher | Up to 12 MP, JPEG. Short-lived stream pattern. |
| Video capture (clip) | DAT clip API | DAT clip API | Recorded to glasses storage, transferred over BT. |
| Video frame stream | Session.addStream(StreamConfiguration) | streamSession.videoFramePublisher | Configurable resolution and frame rate; LOW/2 fps is typical for vision pipelines. |
| Audio chunk stream (mic) | AudioRecord over BLUETOOTH_SCO | AVAudioEngine over .allowBluetooth route | Mic audio captured on the glasses, streamed to the phone over Bluetooth HFP/SCO. Recognition runs on the phone. |
| Audio playback (TTS / earcon) | TextToSpeech + A2DP | AVSpeechSynthesizer + AVAudioSession | Audio synthesized on the phone, played out the glasses speakers via Bluetooth A2DP. |
| Sensor data (IMU) | DAT sensor API | DAT sensor API | Accelerometer, gyroscope, magnetometer. |
| Hardware events | DAT error stream | DAT error publisher | Hinges-closed (user folded the glasses), thermal warnings, audio-route changes. |
| Mock Device Kit | mwdat-mockdevice artifact | MWDATMockDevice SPM module | Simulates the BLE/SDK layer; test on phone or emulator without paired glasses. |
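The LOW/2 fps note in the frame-stream row is worth making concrete: vision pipelines rarely want every frame the stream delivers, so apps commonly throttle on the receiving side. A sketch in TypeScript with illustrative types (none of these names are DAT or Extentos APIs):

```typescript
// Hypothetical sketch: throttle an incoming video-frame stream down to a
// target rate before handing frames to a vision pipeline. The Frame type
// and callback shape are invented for illustration, not the DAT API.
type Frame = { timestampMs: number; data: Uint8Array };

function makeFrameThrottle(targetFps: number, handle: (f: Frame) => void) {
  const minIntervalMs = 1000 / targetFps;
  let lastEmittedMs = -Infinity;
  return (frame: Frame) => {
    if (frame.timestampMs - lastEmittedMs >= minIntervalMs) {
      lastEmittedMs = frame.timestampMs;
      handle(frame); // forward to the vision pipeline
    }
    // otherwise drop the frame; a 2 fps pipeline doesn't need more
  };
}
```

Dropping frames at the app layer keeps the Bluetooth link and the vision model decoupled: you can subscribe at whatever rate the stream offers and still present a steady cadence downstream.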

Capabilities NOT in the public toolkit

These are real Ray-Ban Meta features but not third-party-accessible:

  • "Hey Meta" wake word — system-level, owned by Meta AI. Custom voice triggers must use the phone's speech recognizer over Bluetooth audio, not the glasses' wake-word engine.
  • Heads-up display rendering (Display variant) — Meta controls the display surface. Third-party DAT apps can only listen for sensor/audio/camera events, not draw UI to the display.
  • Custom tap / swipe / multi-finger gestures — not in public preview as of 2026-04. Standard DAT lifecycle events (pause, resume, stop) are listenable, but custom gesture recognition is not exposed.
  • Always-on background audio routing — apps cannot keep the SCO mic open continuously; the toolkit and phone OS impose lifecycle restrictions.

Why these matter for app design: voice triggers in your app must be a custom phrase recognized by your phone's speech recognizer (SpeechRecognizer on Android, SFSpeechRecognizer on iOS) over Bluetooth audio, not "Hey Meta, my app, do X". The glasses are the microphone; the phone is the recognizer.
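A minimal sketch of the phone-side matching step, assuming your recognizer hands you raw transcript strings (the function names here are invented for illustration, not part of DAT or Extentos):

```typescript
// Recognizers return messy casing and punctuation, so normalize both the
// transcript and the trigger phrase before comparing.
function normalize(text: string): string {
  return text
    .toLowerCase()
    .replace(/[^\p{L}\p{N}\s]/gu, "") // strip punctuation, keep letters/digits
    .replace(/\s+/g, " ")
    .trim();
}

function matchesTrigger(transcript: string, phrase: string): boolean {
  // A trigger fires if the normalized phrase appears anywhere in the
  // transcript; partial results often include surrounding words.
  return normalize(transcript).includes(normalize(phrase));
}
```

In practice you'd feed this from `SpeechRecognizer` partial results on Android or `SFSpeechRecognizer` on iOS, with the glasses mic as the audio input route.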

Distribution state (2026-04)

This is the part that surprises new Meta Ray-Ban developers. Distribution today has two paths, and the obvious one isn't open yet.

Path 1 — Private distribution (works today)

You can ship a Ray-Ban Meta app to anyone with a paired Ray-Ban Meta and an Android phone or iPhone, today, with no Meta review:

  • Android: sideload the APK directly, distribute via internal track, or push through enterprise channels. Users install your app on their phone; the DAT-based glasses-pairing flow runs the first time they open it.
  • iOS: TestFlight (up to 10,000 testers), enterprise distribution, or AdHoc for small teams. Same flow on the phone side.
  • No Meta App Store listing required. Your app is a regular phone app that also talks to the glasses via DAT.

This is how the vast majority of third-party Ray-Ban Meta apps ship today. It's the path Extentos optimizes for.

Path 2 — Meta App Store / Meta Horizon Store (limited)

Meta runs an app store surfaced on the glasses companion app (Meta AI / Meta View). As of 2026-04 it is preview-limited — only Meta-curated partners are listed there. There's no public submission flow open to all developers. Developers can apply for partnership but approval is gated.

The Meta App Store is also where the Display-rendering capability lives — apps that draw to the heads-up display on the Ray-Ban Meta Display variant must be Meta-store-listed and approved. Privately distributed third-party apps cannot draw to the display, even on hardware that has it.

As of 2026-04-30 — Meta App Store policy is the area most likely to evolve. Meta's annual Connect conference (typically September) has historically been when new developer-program tiers are announced. Verify the current state before shipping copy that quotes specific store policy.

Privileged partnerships (the Google Maps question)

A small set of apps are featured as Meta-curated integrations on Ray-Ban Meta and Ray-Ban Meta Display. These have privileges the public DAT toolkit doesn't expose — most notably display rendering on the Display variant.

Public examples that have been demoed or shipped as Meta-partnered integrations:

  • Google Maps (Ray-Ban Meta Display) — turn-by-turn navigation rendered on the heads-up display
  • Spotify — voice-driven music control with glasses-side feedback
  • WhatsApp — voice-driven messaging
  • Audible — voice-driven audiobook playback

As of 2026-04-30 — partner list is illustrative and based on public demonstrations through early 2026. Verify the current featured-app list at meta.com/smart-glasses before quoting specific names publicly.

These integrations are the visible tip of "what Ray-Ban Meta apps can do." A new third-party developer should not assume they'll have the same access; the public DAT toolkit is the practical surface for ~99% of developers as of 2026-04.

Development state — what you can test with today

Building and testing an app for Ray-Ban Meta in 2026 is more accessible than the distribution story suggests. The development tools are open:

| Tool | What it does | Cost |
| --- | --- | --- |
| Meta DAT SDK | The real toolkit — Android (mwdat-core, mwdat-camera) and iOS (MWDATCore, MWDATCamera) | Free |
| Meta Mock Device Kit | Simulates the BLE/SDK layer on phone or emulator | Free, ships with DAT |
| Meta Developer Center registration | One-time signup with your personal Meta credentials; gives you MetaAppID, ClientToken, TeamID | Free, ~15–30 min setup |
| Real Ray-Ban Meta hardware | Pair to your phone over Bluetooth using the Meta AI / Meta View app, then your DAT app uses it | Cost of the glasses |
| Extentos browser simulator | App-layer simulator at extentos.com/s — tests the wearer experience without real hardware | Free for 1,000 events, free email-only account after |
| Extentos LocalSimTransport | Wraps Meta's Mock Device Kit in the Extentos abstraction; same glasses.* API works against simulator and real glasses | Free forever, no account |

You don't need permission from Meta to start developing. Sign up at the Meta Developer Center, drop the DAT SDK into your Android or iOS app (or use Extentos which wraps it), and you can test today against Mock Device Kit, Extentos's browser simulator, or your own paired Ray-Ban Meta.

The distribution gate is the hard one. Development is open.

Required setup for development

Concrete checklist for a third-party Meta Ray-Ban app, regardless of whether you use Extentos:

Both platforms

  • Meta Developer Center account (one-time, free)
  • Meta App ID and Client Token from the Developer Center
  • A paired Ray-Ban Meta to test on real hardware (or Mock Device Kit / Extentos simulator instead)

Android

  • Min SDK: as required by DAT (typically API 26+)
  • Maven artifacts under the com.meta.wearable group (note the singular wearable, unlike the plural in the iOS repository name): com.meta.wearable:mwdat-core, com.meta.wearable:mwdat-camera, and optionally com.meta.wearable:mwdat-mockdevice for testing. Pinned to v0.5.0 in current Extentos builds.
  • Distributed via Meta's GitHub Packages with token-based access setup (Maven Central promotion is Meta's roadmap, not current state)
  • AndroidManifest permissions: BLUETOOTH_CONNECT, BLUETOOTH_SCAN, BLUETOOTH_ADMIN, RECORD_AUDIO, CAMERA (depending on capabilities)
  • Signature registered with the Meta App ID for production
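A hedged build.gradle.kts sketch of the dependency wiring above. The artifact coordinates and version come from this checklist; the GitHub Packages repository URL and the credential property names are assumptions, so check the DAT GitHub repository's setup instructions for the authoritative values:

```kotlin
// build.gradle.kts (module) — illustrative wiring for the DAT Android artifacts.
repositories {
    maven {
        // Assumed repository path; verify against Meta's published instructions.
        url = uri("https://maven.pkg.github.com/facebook/meta-wearables-dat-android")
        credentials {
            // Property names are an example convention, not Meta's requirement.
            username = providers.gradleProperty("gpr.user").get()
            password = providers.gradleProperty("gpr.token").get() // GitHub token with read:packages
        }
    }
}

dependencies {
    implementation("com.meta.wearable:mwdat-core:0.5.0")
    implementation("com.meta.wearable:mwdat-camera:0.5.0")
    testImplementation("com.meta.wearable:mwdat-mockdevice:0.5.0") // Mock Device Kit, test-only
}
```

The token-based repository block is the part that trips people up: until Meta's Maven Central promotion lands, builds fail with a 401 unless the GitHub Packages credentials are configured.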

iOS

  • Min iOS: 15.2 (Meta DAT minimum). Swift 6 required for current DAT.
  • Swift Package Manager dependency on https://github.com/facebook/meta-wearables-dat-ios
  • SPM modules: MWDATCore, MWDATCamera, MWDATMockDevice for testing. Pinned to from: "0.6.0" in current Extentos builds.
  • Info.plist keys:
    • MWDAT dictionary with MetaAppID, ClientToken, TeamID, AppLinkURLScheme (a custom URL scheme — not a universal link)
    • CFBundleURLTypes matching the AppLinkURLScheme
    • LSApplicationQueriesSchemes includes fb-viewapp (to query the Meta AI app)
    • UISupportedExternalAccessoryProtocols includes com.meta.ar.wearable
    • UIBackgroundModes includes bluetooth-central, bluetooth-peripheral, external-accessory
    • Privacy strings: NSCameraUsageDescription, NSMicrophoneUsageDescription, NSSpeechRecognitionUsageDescription, NSBluetoothAlwaysUsageDescription
  • The companion-app registration callback returns to your app via the configured URL scheme — Wearables.shared.handleUrl(_:) must be wired in onOpenURL for the registration flow to complete
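An illustrative Info.plist fragment covering the keys in the checklist above, with placeholder values (substitute your real MetaAppID, ClientToken, TeamID, and URL scheme from the Developer Center):

```xml
<!-- Illustrative fragment: all values are placeholders, not real credentials. -->
<key>MWDAT</key>
<dict>
	<key>MetaAppID</key>
	<string>YOUR_META_APP_ID</string>
	<key>ClientToken</key>
	<string>YOUR_CLIENT_TOKEN</string>
	<key>TeamID</key>
	<string>YOUR_TEAM_ID</string>
	<key>AppLinkURLScheme</key>
	<string>yourapp-dat</string>
</dict>
<key>CFBundleURLTypes</key>
<array>
	<dict>
		<key>CFBundleURLSchemes</key>
		<array>
			<string>yourapp-dat</string>
		</array>
	</dict>
</array>
<key>LSApplicationQueriesSchemes</key>
<array>
	<string>fb-viewapp</string>
</array>
<key>UISupportedExternalAccessoryProtocols</key>
<array>
	<string>com.meta.ar.wearable</string>
</array>
```

The URL scheme must appear in both MWDAT (as AppLinkURLScheme) and CFBundleURLTypes; a mismatch between the two is the most common reason the registration callback never reaches your app.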

Extentos generates the boilerplate for all of this through the MCP server's generateConnectionModule tool — see Quickstart with an AI agent.

Audio architecture (developer-relevant)

This is a recurring source of confusion that's worth pinning down explicitly:

  • Microphone capture path: the glasses microphone physically captures audio. That audio streams to the phone over Bluetooth via the HFP/SCO profile (call-style bidirectional). Speech recognition runs on the phone, not on the glasses, using the platform's native recognizer.
  • Speaker playback path: TTS audio is synthesized on the phone, then routed back to the glasses speaker via Bluetooth A2DP. Synthesis is a phone job; playback is glasses output.
  • Coexistence constraint: A2DP (high-quality stereo) and HFP/SCO (low-latency bidirectional) on the same BT connection compete. When a video stream is active, audio routing may downgrade automatically. The DAT SDK and RealMetaTransport handle this; Extentos surfaces coexistence.warning events when it happens so you can see it in the simulator.
  • Zero Extentos runtime cost. Both platforms handle the hard parts (recognition, synthesis) natively. The Extentos library wires the Bluetooth audio session so the right mic and speaker are in the path. Your shipped app pays nothing to Extentos for voice or audio at runtime.
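The coexistence constraint can be modeled as a small routing decision. This is an illustrative model, not the DAT API; every type and function name here is invented for the example:

```typescript
// Toy model of the A2DP/HFP coexistence rule: bidirectional audio plus an
// active video stream contend for Bluetooth bandwidth, so the route
// degrades and the app is told about it.
type AudioRoute = "a2dp" | "sco" | "sco-degraded";

interface SessionState {
  videoStreamActive: boolean;
  micOpen: boolean;
}

function pickRoute(s: SessionState, warn: (msg: string) => void): AudioRoute {
  if (s.micOpen && s.videoStreamActive) {
    warn("coexistence: audio downgraded while video stream is active");
    return "sco-degraded";
  }
  if (s.micOpen) return "sco"; // call-style bidirectional path (mic in use)
  return "a2dp";               // high-quality playback-only path
}
```

The useful design point is the `warn` callback: surfacing the downgrade as an event (as Extentos does with coexistence.warning) lets you observe it in a simulator instead of discovering it as mystery audio quality loss on real hardware.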

Roadmap and what to expect

As of 2026-04-30 — this section is forward-looking speculation, not verified fact. Revisit annually after Meta Connect.

Based on the trajectory of the Meta DAT toolkit through early 2026:

  • Capability adds are roughly quarterly. Meta has shipped audio routing, video streams, sensor capabilities, and Mock Device Kit additions over 2024–2025. The cadence is steady.
  • Public Display rendering may open at some point, likely tier-gated (Meta-approved apps first, then broader). No public timeline as of 2026-04.
  • Custom gestures are likely future-toolkit work — the hardware supports tap and swipe; only the public API surface is missing.
  • Meta App Store is the hardest one to predict. Meta has indicated developer-program expansion plans without committing to specific dates.
  • Hey Meta integration is unlikely to open to third parties — the wake-word engine is core to Meta AI's product positioning.

Plan for the public DAT toolkit as it stands. Treat private distribution as the production path. Watch Meta Connect (September) for major announcements.

How Extentos relates

Extentos sits on top of Meta DAT and abstracts it:

  • RealMetaTransport wraps the real DAT SDK (mwdat-core / MWDATCore + mwdat-camera / MWDATCamera). Your app code talks to glasses.camera.capturePhoto(); Extentos translates to DAT calls.
  • LocalSimTransport wraps Meta's Mock Device Kit (mwdat-mockdevice / MWDATMockDevice). Same glasses.* API, no real glasses needed, no network.
  • BrowserSimTransport is Extentos-original — a browser-based simulator that tests the wearing experience (voice triggers, TTS, hardware alerts) at the app layer rather than the BLE layer. See transport vs app simulation for the framing.
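The layering above can be sketched as one interface with interchangeable implementations. Everything in this sketch except the idea behind glasses.camera.capturePhoto() is invented for illustration; the real Extentos and DAT surfaces will differ:

```typescript
// Hypothetical transport interface: app code depends only on this shape.
interface GlassesTransport {
  capturePhoto(): Promise<Uint8Array>;
}

// Stand-in for a simulator transport: canned bytes, no hardware, no network.
class FakeSimTransport implements GlassesTransport {
  async capturePhoto(): Promise<Uint8Array> {
    return new Uint8Array([0xff, 0xd8]); // JPEG magic bytes as a stub
  }
}

// App logic never names a concrete transport, so swapping in a real one
// (the equivalent of RealMetaTransport wrapping DAT) changes no app code.
async function captureAndMeasure(t: GlassesTransport): Promise<number> {
  const photo = await t.capturePhoto();
  return photo.byteLength;
}
```

This is the property the page keeps returning to: the simulator and the real glasses sit behind the same seam, so test coverage written against the fake carries over to hardware runs.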

When Meta opens new public-toolkit capabilities, Extentos adds them to the abstraction. When Meta opens display rendering or custom gestures, Extentos's AppSpec and the simulator both extend to cover them. Your code doesn't change shape.

When other vendors enter the market — Mentra G1, Android XR, Apple smart glasses — Extentos ships additional transports. The AppSpec format stays vendor-agnostic. Your existing Meta Ray-Ban app code can target a future vendor with a config change, not a rewrite.

Frequently asked questions

Can I publish my Meta Ray-Ban app to the Meta App Store?

As of 2026-04, the Meta App Store is preview-limited to Meta-curated partners. Most third-party developers ship via private distribution — APK sideload on Android, TestFlight on iOS, or enterprise channels. This works today and requires no Meta review.

How do users install my Meta Ray-Ban app?

Users install your phone app (Android via APK or Play Store; iOS via TestFlight or App Store), then your app pairs with their Ray-Ban Meta the first time they open it via the DAT registration flow. Your app is a regular phone app that talks to the glasses over Bluetooth.

Why does Google Maps get to render on the Ray-Ban Meta Display when my app can't?

Display rendering is not exposed in the public DAT toolkit. Google Maps and other featured integrations are Meta-curated partnerships with privileged access to the display surface. Most third-party apps cannot draw to the display, even on Display-variant hardware. Whether this opens up later is unannounced.

Can my third-party app trigger on "Hey Meta, [my app], do X"?

No. The "Hey Meta" wake word is system-level and not exposed to third parties. Use a custom voice trigger phrase that's recognized by your phone's speech recognizer over Bluetooth audio. Extentos's voice-trigger primitive does this for you.

Does my app work on all Ray-Ban Meta and Oakley Meta variants?

Yes. Extentos runs on every model on the market — Ray-Ban Meta Gen 1 (Wayfarer, Headliner, Skyler, and other frame styles), Ray-Ban Meta Gen 2, Ray-Ban Meta Display, and Oakley Meta HSTN. Frame style doesn't affect capabilities. Camera, microphone, speaker, and sensor support is identical across every model.

Does Extentos work with Ray-Ban Meta Display?

Yes. Your Extentos-built app runs on a Display-variant Ray-Ban Meta the same way it runs on Gen 1 or Gen 2 — full camera, voice trigger, audio, and sensor support. What's not available on Display is the heads-up display rendering and the neural-band gesture input, because Meta doesn't expose those in the public DAT toolkit; they're reserved for Meta-curated partner integrations like Google Maps. Your third-party app on a Display wearer gets exactly the same experience it gives a Gen 2 wearer — no display rendering either way, no degradation either way.

What Meta smart-glasses models does Extentos support?

All of them. Compatibility is determined by Meta's Device Access Toolkit (DAT), which is uniform across every Ray-Ban Meta and Oakley Meta variant. Extentos works on Ray-Ban Meta Gen 1 frame styles (Wayfarer, Headliner, Skyler), Ray-Ban Meta Gen 2, Ray-Ban Meta Display, and Oakley Meta HSTN. Newer Meta-branded models that ship after this writing will work without code changes as long as Meta keeps using DAT — which has been the consistent pattern.

Do I need a paid Meta Developer account to ship a Meta Ray-Ban app?

No. The Meta Developer Center account is free. The DAT SDK is free. Mock Device Kit is free. Meta charges nothing for using the public toolkit.

What happens to my app when the user folds their glasses?

The DAT SDK emits a hinges_closed hardware event. Extentos surfaces this as a transport.hardware_alert event in the event log and (depending on your spec) can dispatch a hinges_closed trigger. Apps typically pause active streams and wait for the user to unfold.
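At the app layer, the typical pause-and-resume pattern looks like this (an illustrative sketch; the class and method names are invented, not the DAT or Extentos API):

```typescript
// Track active streams and suspend them all when the glasses fold,
// remembering what to bring back when they unfold.
type StreamId = string;

class StreamManager {
  private active = new Set<StreamId>();
  private pausedByFold: StreamId[] = [];

  start(id: StreamId) {
    this.active.add(id);
  }

  onHingesClosed() {
    // Snapshot what was running, then pause everything while folded.
    this.pausedByFold = [...this.active];
    this.active.clear();
  }

  onHingesOpened() {
    for (const id of this.pausedByFold) this.active.add(id);
    this.pausedByFold = [];
  }

  isActive(id: StreamId) {
    return this.active.has(id);
  }
}
```

Keeping the fold snapshot separate from the active set means a stream the user explicitly stopped while folded doesn't spuriously resume on unfold.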

Is real-hardware testing required to develop a Ray-Ban Meta app?

No. Mock Device Kit (via Extentos's LocalSimTransport) and the Extentos browser simulator (BrowserSimTransport at extentos.com/s) cover the bulk of the dev loop. Real hardware verifies the last mile — fidelity of the camera frames, latency of the voice loop, real-world A2DP/HFP coexistence — but is not required for daily iteration.

What about Apple smart glasses?

Apple has announced AR glasses development but has not shipped a third-party SDK as of 2026-04. Extentos plans to add an Apple transport once Apple opens a developer toolkit; your AppSpec won't need to change.

What about Mentra G1 and Android XR?

Both are tracked on the Extentos vendor roadmap. The AppSpec is vendor-agnostic by design — adding a vendor means adding a transport implementation, not changing app code or specs.