Frequently asked questions

Consolidated FAQ for Extentos — the smart-glasses development layer for Meta Ray-Ban (and future vendors). What it is, how to install it, what it costs (free for development, no paid tier at launch), what hardware it supports (every Ray-Ban Meta and Oakley Meta variant in market), what it collects (aggregate metadata, never end-user content), how it differs from Meta's Device Access Toolkit and Mock Device Kit, and where it's headed. Curated answers with links to the deep pages.

The most-asked questions about Extentos, organized by topic and cross-linked to the deep documentation. If you don't find your question here, see the support page or open an issue on the GitHub repository.

Getting started

Is Extentos free?

Yes. All 18 MCP tools, on-device simulation through Meta's Mock Device Kit, code generation, validation, and real-hardware testing on Ray-Ban Meta are free with no account, forever. The browser simulator at extentos.com/s is free for the first 1000 runtime events on your MCP install; after that, you create a free email-only account (no payment) to keep going. There is no paid tier at launch. See pricing for the full breakdown.

Do I need an Extentos account to start?

No. You can install the MCP server, generate code, scaffold projects, validate specs, and test on real Ray-Ban Meta hardware without ever creating an Extentos account. Account linking only kicks in when the 1000-event browser-simulator meter runs out — and even then, the account is free and email-only, with no payment. See auth for the full anonymous-first model.

How do I install Extentos in my AI coding agent?

The shortest path is the agent prompt: paste "Install @extentos/mcp-server in my MCP config" into Claude Code, Cursor, Windsurf, or Cline. The agent edits the right config file or runs the right command for its host. For Claude Code specifically, the one-liner is claude mcp add extentos -- npx -y @extentos/mcp-server@latest. See install for per-host instructions and config-file locations.
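
For the JSON-edit hosts, the server entry follows the common mcpServers shape. A sketch assuming that convention (see install for each host's exact config-file path):

    {
      "mcpServers": {
        "extentos": {
          "command": "npx",
          "args": ["-y", "@extentos/mcp-server@latest"]
        }
      }
    }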

What prerequisites do I need?

Node.js 20 or newer on PATH. That's it for the MCP server — npx fetches and caches the package on first run. To target real Ray-Ban Meta hardware later, you'll also need a Meta Developer Center account (free, ~15–30 minute setup) for the Meta DAT credentials. See vendors/meta for the Meta-specific setup.

Hardware support

What smart glasses does Extentos support?

As of 2026-04, Extentos is GA on Meta Ray-Ban only — every Ray-Ban Meta variant in market (Gen 1 Wayfarer, Headliner, Skyler frame styles; Gen 2 refresh; Ray-Ban Meta Display) plus Oakley Meta HSTN. Mentra G1, Android XR, and Apple smart glasses are tracked roadmap vendors but not yet transport-implemented. See vendors for the full status table.

Does Extentos work with Ray-Ban Meta Display?

Yes. Your Extentos-built app runs on a Display-variant Ray-Ban Meta the same way it runs on Gen 1 or Gen 2 — full camera, voice trigger, audio, and sensor support. What's not available on Display is heads-up display rendering and neural-band gesture input: Meta reserves those surfaces for curated partners rather than exposing them in the public DAT toolkit. Third-party apps don't get them, even on Display hardware. See vendors/meta § Compatible models.

Why does Google Maps get to render on the Ray-Ban Meta Display when my app can't?

Display rendering is not exposed in the public Meta DAT toolkit. Google Maps, Spotify, WhatsApp, and other featured integrations are Meta-curated partnerships with privileged access to the display surface. The path to display rendering today is a Meta partnership, not the public DAT toolkit. See vendors/meta § Privileged partnerships.

Do I need real Ray-Ban Meta glasses to develop?

No. Mock Device Kit (via Extentos's LocalSimTransport) and the Extentos browser simulator (BrowserSimTransport at extentos.com/s) cover the bulk of the dev loop. Real hardware verifies the last mile (camera fidelity, Bluetooth latency, real-world coexistence) but isn't required for daily iteration. See transport vs app simulation for what each simulator does.

When will Mentra G1 / Android XR / Apple be supported?

No committed timelines. New vendors are added when adding them makes a real product (transport implementation + simulator coverage + tested hardware), which depends on each vendor's SDK maturity and hardware availability. See the changelog for actual release activity. The vendors page tracks roadmap status.

Pricing and accounts

What counts as a "runtime event"?

An event emitted by the browser simulator during a session — trigger.fired, block.executed, callback.invoked, flow.completed, etc. A typical voice-to-AI interaction emits 5–10 events, so 1000 events ≈ 100–200 full interactions. MCP tool calls, code generation, on-device LocalSimTransport activity, and real-hardware testing do not count against this meter, ever. See pricing § What counts as one event.

Does my shipped app cost me anything?

No. Production apps run against real Ray-Ban Meta hardware via RealMetaTransport, which doesn't talk to the Extentos backend at all. Your shipped app's runtime cost from Extentos is zero per end-user. The only Extentos-side cost is dev-time browser-simulator usage above the 1000-event meter, which falls under the free email-only account.

Will Extentos add a paid tier later?

Possibly, as the product matures. If paid options are introduced, existing free accounts will not be retroactively gated, and material changes will be announced via the changelog. Today there is no paid tier.

Can I share an Extentos account across my team?

Single accounts only at MVP. Multi-seat billing and team workspaces are deferred post-launch. Each developer should run their own MCP install with their own account — the per-machine installId is anonymous, so separate installs don't interfere with collaborating on one codebase.

Capabilities and limits

Can my third-party app trigger on "Hey Meta, [my app], do X"?

No. The "Hey Meta" wake word is system-level and not exposed to third-party apps. Your custom voice trigger is recognized by your phone's speech recognizer (SpeechRecognizer on Android, SFSpeechRecognizer on iOS) over Bluetooth audio — the wake word is your custom phrase, not "Hey Meta" + your phrase. See vendors/meta § Capabilities NOT in the public toolkit.
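
A minimal Android-side sketch of that path, using the platform's real SpeechRecognizer API. The class name, trigger phrase, and hand-off to your flow are illustrative, and it assumes Bluetooth audio routing from the glasses is already active:

    import android.content.Context
    import android.content.Intent
    import android.os.Bundle
    import android.speech.RecognitionListener
    import android.speech.RecognizerIntent
    import android.speech.SpeechRecognizer

    // Requires the RECORD_AUDIO permission; glasses audio arrives as the
    // phone's Bluetooth microphone input.
    class VoiceTrigger(context: Context, private val onTrigger: () -> Unit) {
        private val recognizer = SpeechRecognizer.createSpeechRecognizer(context)
        private val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
            putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                     RecognizerIntent.LANGUAGE_MODEL_FREE_FORM)
        }

        fun start() {
            recognizer.setRecognitionListener(object : RecognitionListener {
                override fun onResults(results: Bundle?) {
                    val heard = results
                        ?.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                        ?.firstOrNull()
                        .orEmpty()
                    // Your custom phrase, not "Hey Meta" plus your phrase.
                    if ("picture this" in heard.lowercase()) onTrigger()
                    recognizer.startListening(intent) // keep the loop alive
                }
                // Naive retry; production code should back off on busy errors.
                override fun onError(error: Int) { recognizer.startListening(intent) }
                // Remaining callbacks are no-ops in this sketch.
                override fun onReadyForSpeech(params: Bundle?) {}
                override fun onBeginningOfSpeech() {}
                override fun onRmsChanged(rmsdB: Float) {}
                override fun onBufferReceived(buffer: ByteArray?) {}
                override fun onEndOfSpeech() {}
                override fun onPartialResults(partialResults: Bundle?) {}
                override fun onEvent(eventType: Int, params: Bundle?) {}
            })
            recognizer.startListening(intent)
        }
    }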

Can my app use custom gestures (tap, swipe)?

Not via the public Meta DAT toolkit as of 2026-04. The hardware supports tap and swipe; Meta's public API surface doesn't expose them yet. Standard DAT lifecycle events (pause, resume, hinges-closed, thermal warnings) work normally.

What can I actually build for Meta Ray-Ban today?

Photo capture, video capture, camera streams, voice-triggered apps (custom phrases via phone STT), TTS-driven audio apps, sensor/IMU consumers — all on the public DAT toolkit. The full capability matrix is at vendors/meta § What's in the Device Access Toolkit.

Why doesn't Mock Device Kit simulate voice triggers?

Mock Device Kit simulates the BLE/SDK layer — your app thinks it's connected to glasses. Voice-trigger recognition runs on the phone's speech recognizer, not on the glasses, so simulating the glasses doesn't simulate the recognizer. Extentos's browser simulator replaces the whole path with the developer's webcam + microphone, which is why it can exercise voice triggers end-to-end. See transport vs app simulation.

Distribution

Can I publish my Meta Ray-Ban app to the Meta App Store?

As of 2026-04, the Meta App Store is preview-limited to Meta-curated partners. There is no public submission flow open to all developers. Most third-party developers ship via private distribution — APK sideload on Android, TestFlight on iOS, or enterprise channels — with no Meta review needed. See vendors/meta § Distribution state.

How do users install my Meta Ray-Ban app?

Users install your phone app (Android via APK or Play Store; iOS via TestFlight or App Store). The first time they open it, the Meta DAT registration flow runs and pairs the app with their Ray-Ban Meta. Your app is a regular phone app that talks to the glasses over Bluetooth — there's no "install on the glasses" step.

Privacy and data collection

Does Extentos collect my users' photos, video, voice transcripts, or AI prompts?

No, never. Extentos's library is architecturally incapable of inspecting app_callback payloads, AI provider responses, photo bytes, video frames, audio samples, or speech transcripts. The full never-collect list is enumerated in security § What we never collect. End-user content stays between your app and your providers.

Does Extentos see my AI provider API keys?

No. AI provider keys (Anthropic, OpenAI, Google, etc.) live in your app code and your provider's auth headers. The app_callback mechanism is a function pointer your code provides; Extentos invokes it and gets the result back. There's no inspection of inputs, no logging of outputs, no proxy layer. See security § BYOK.
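
A minimal sketch of that shape, assuming a hypothetical registration hook (Extentos.onAppCallback; the real entry point may differ). The endpoint and headers are Anthropic's documented ones, and the key lives entirely in your process:

    import okhttp3.MediaType.Companion.toMediaType
    import okhttp3.OkHttpClient
    import okhttp3.Request
    import okhttp3.RequestBody.Companion.toRequestBody

    private val http = OkHttpClient()
    private val anthropicKey =
        System.getenv("ANTHROPIC_API_KEY") ?: error("set ANTHROPIC_API_KEY")

    fun askModel(prompt: String): String {
        // Naive interpolation for brevity; escape prompt as JSON in real code.
        val json = """{"model":"claude-3-5-sonnet-20241022","max_tokens":256,
            "messages":[{"role":"user","content":"$prompt"}]}"""
        val request = Request.Builder()
            .url("https://api.anthropic.com/v1/messages")
            .header("x-api-key", anthropicKey) // your auth header, not Extentos's
            .header("anthropic-version", "2023-06-01")
            .post(json.toRequestBody("application/json".toMediaType()))
            .build()
        http.newCall(request).execute().use { return it.body!!.string() }
    }

    // Hypothetical wiring: Extentos invokes your function and forwards the
    // return value. No proxy, no logging, no inspection in between.
    // Extentos.onAppCallback { prompt -> askModel(prompt) }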

What does Extentos collect, then?

Aggregate metadata — tool calls, install events, vendor adoption rates, capability usage, error rates, library/MCP versions, device models, country (derived from IP, then the IP is dropped). Tagged with anonymous IDs that can never be reversed back to a person. This data powers a developer-facing analytics dashboard (coming soon), Extentos's product roadmap, and aggregate ecosystem signals shared with glasses vendors as market data (10-account aggregation floor). Full schema in security § What we collect.

Will Extentos sell data about my specific app to my competitors?

No. Aggregate ecosystem signals (vendor market share, capability popularity) are sold to glasses vendors only as aggregates over at least 10 distinct accounts, never as individual-developer or individual-end-user data. The 10-account floor is the structural guarantee against competitive leakage.

Can I opt out of telemetry?

Yes — three independent surfaces. (1) EXTENTOS_TELEMETRY=0 env var disables MCP-side telemetry for the current shell. (2) npx @extentos/mcp-server@latest decline-privacy disables it permanently for the install. (3) ExtentosConfig.telemetryConsent = false in your shipped app disables library-runtime telemetry per-app. Any combination works. See security § Three opt-out surfaces.
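
Surface (3) in app code, as a minimal sketch: telemetryConsent is the flag named above, while the package path and Application wiring are illustrative.

    import android.app.Application
    import com.extentos.ExtentosConfig // hypothetical import path

    class MyGlassesApp : Application() {
        override fun onCreate() {
            super.onCreate()
            // Disables library-runtime telemetry for every user of this build.
            ExtentosConfig.telemetryConsent = false
        }
    }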

Architecture and AI agents

Is Extentos a wrapper around Meta's Device Access Toolkit?

Partially. Extentos has three internal transport implementations: RealMetaTransport wraps the real Meta DAT SDK for production hardware; LocalSimTransport wraps Meta's Mock Device Kit for fast on-device testing; BrowserSimTransport is an Extentos-original architecture (browser simulator at extentos.com/s, WebSocket protocol, no DAT involvement) for the headline dev experience. See transport vs app simulation for the deep dive.
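
Conceptually, the three implementations sit behind one transport boundary. A sketch with illustrative shapes, not the real internals:

    sealed interface GlassesTransport {
        fun connect()
    }

    class RealMetaTransport : GlassesTransport {   // production: wraps the Meta DAT SDK
        override fun connect() { /* open a DAT session to the paired glasses */ }
    }

    class LocalSimTransport : GlassesTransport {   // dev loop: wraps Mock Device Kit
        override fun connect() { /* attach to the mocked BLE/SDK layer */ }
    }

    class BrowserSimTransport : GlassesTransport { // dev loop: Extentos-original, no DAT
        override fun connect() { /* WebSocket session to extentos.com/s */ }
    }

Your app targets the boundary; which implementation is active is wiring, not app code. That separation is what makes the next answer possible.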

Does my code change when targeting different vendors?

No, by design. Your AppSpec is written in vendor-agnostic capability primitives (capture_photo, voice_command, speak_text, etc.). When a new vendor (Mentra G1, Android XR, Apple smart glasses) gets a transport implementation in Extentos, your existing app code runs on it without rewrites. See capabilities § Why a shared vocabulary.
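
For flavor, a hypothetical AppSpec fragment using those primitives (the field names are illustrative, not the actual schema):

    {
      "app": "field-notes",
      "triggers": [
        { "type": "voice_command", "phrase": "picture this" }
      ],
      "flows": [
        {
          "on": "voice_command",
          "blocks": [
            { "capability": "capture_photo" },
            { "capability": "app_callback", "handler": "describePhoto" },
            { "capability": "speak_text", "from": "describePhoto.result" }
          ]
        }
      ]
    }

Nothing in it names Meta, Bluetooth, or a transport; that property is what lets the same spec run on a future vendor.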

Which AI coding agent works best with Extentos?

All MCP-compatible agents (Claude Code, Cursor, Windsurf, Cline) work — Extentos's tools are protocol-standard. Claude Code has the smoothest install (single CLI command); the others need a small JSON config edit. Tool capabilities are identical across hosts. See install for per-host setup.

What does the AI agent actually do with the MCP server?

The agent calls Extentos's 18 deterministic tools to read your project state, scaffold a connection module, generate a vendor-agnostic AppSpec, stub out handler functions, validate the integration, and provision a browser simulator session — all without you typing any setup code. The canonical sequence is getPlatformInfo → searchDocs → generateConnectionModule → initSpec → generateConsumer → validateIntegration → createSimulatorSession. See MCP server overview for the full tool catalog.

What happens when the user folds their glasses mid-flow?

Meta DAT emits a hinges_closed hardware event. Extentos surfaces this as a transport.hardware_alert event in the structured event log and (depending on your spec) can dispatch a hinges_closed trigger that fires whatever flow you defined for that case. Apps typically pause active streams and wait for the user to unfold.
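
In spec terms, handling the fold might look like the fragment below, where the field names are again illustrative and stop_stream stands in for whatever pause logic your app defines:

    {
      "triggers": [
        { "type": "hinges_closed" }
      ],
      "flows": [
        {
          "on": "hinges_closed",
          "blocks": [
            { "capability": "stop_stream" }
          ]
        }
      ]
    }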

Help and resources

Where do I file a bug or feature request?

GitHub issues: github.com/Asgermolgaard/vibe-hardware/issues. Include your install version (npx @extentos/mcp-server@latest whoami), the platform (Android / iOS), the host (Claude Code / Cursor / Windsurf / Cline), and a minimal reproduction.

Where's the source code and license?

Source: github.com/Asgermolgaard/vibe-hardware. The MCP server (@extentos/mcp-server) is MIT-licensed; the Android and iOS libraries are source-available in the same repo. See resources/license for current declarations and pre-1.0 status.

Where do I find release notes?

The changelog tracks notable releases. Pre-1.0 (current @0.0.16) means APIs may shift between minor versions until the hardware test loop closes — pin to an exact version if you need cross-session reproducibility.
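
Pinning just means swapping @latest for the exact version wherever the package is referenced, e.g. npx -y @extentos/mcp-server@0.0.16, so the tool surface can't shift under a long-running project.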
