MCP server

Credentials checklist

Provider keys you'll need before deploying. Extentos itself is free and requires no provider keys for simulator use. AI provider keys (Anthropic, OpenAI, Google, DeepL, etc.) are bring-your-own-key (BYOK), needed only when your spec's app_callback handlers actually call those services; Meta Developer Center credentials are needed only when shipping to real Meta Ray-Ban hardware. This page is the quick-reference checklist for the developer's coding agent.

Extentos itself needs no provider keys. The MCP server, simulator, and code generation work for free with no signup at all (the auth flow handles linking later).

Provider keys come into the picture in two places:

  • app_callback handlers — the spec's escape hatch where developer code runs. If your handlers call Anthropic / OpenAI / DeepL / etc., they need keys. Extentos never sees these keys; they live in the developer's process.
  • Real Meta Ray-Ban hardware — testing or shipping on physical glasses (not just the browser simulator) requires the developer's own Meta Developer Center registration.

Surface the relevant entries from this page while scaffolding the spec, not at deploy time. If a developer is writing a translation handler, mention DeepL or Google Translate in the same exchange — not three days later when they try to ship.

Quick decision table

If your spec includes… | You need…
speak_text action (simulator only) | Nothing — Extentos voice proxy handles dev-time TTS
speak_text action (real hardware) | Nothing — phone-native TTS; optional premium TTS provider
Voice trigger (simulator only) | Nothing — Extentos voice proxy + browser STT
Voice trigger (real hardware) | Nothing — phone-native STT
app_callback calling an LLM | Anthropic, OpenAI, Google Gemini, or another LLM provider
app_callback calling a vision API | OpenAI Vision, Google Vision, Anthropic, Azure, or AWS Bedrock
app_callback calling translation | DeepL, Google Translate, or another translation provider
app_callback calling OCR | Google Vision, Azure, or another OCR provider
Premium voice synthesis in production | ElevenLabs, Azure Neural, Play.ht, or similar
Real Meta Ray-Ban hardware test or ship | Meta Developer Center app registration

The simulator path (rows 1, 3) intentionally needs zero credentials — that's the point of Extentos's anonymous-first model. Production keys come up only when the developer leaves the simulator.
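When a spec does cross into the keyed rows, the cheapest failure mode is a clear error at handler startup. A minimal sketch — requireProviderKey and its error text are illustrative, not part of any Extentos API:

```kotlin
// Illustrative helper (not an Extentos API): resolve a provider key at handler
// startup and fail with the portal URL when it is missing, so the gap surfaces
// while scaffolding rather than deep inside a request.
fun requireProviderKey(envVar: String, portal: String): String =
    System.getenv(envVar)
        ?: error("$envVar is not set — create a key at $portal and export it.")

fun main() {
    val key = requireProviderKey("ANTHROPIC_API_KEY", "console.anthropic.com/settings/keys")
    println("key loaded (${key.length} chars)")
}
```

Generated handler templates can call something like this once per provider so the missing-key message names the exact portal from the table above.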

LLM providers

Only one is required; pick whichever the developer prefers. All have free tiers usable for development.

Anthropic (Claude)

  • Portal: console.anthropic.com/settings/keys
  • Env var convention: ANTHROPIC_API_KEY
  • Free tier: $5 of credit on signup; pay-as-you-go after.
  • Best for: Long-context reasoning, tool use, code generation in handlers.

OpenAI

  • Portal: platform.openai.com/api-keys
  • Env var convention: OPENAI_API_KEY
  • Free tier: Limited free credits on signup; pay-as-you-go after. Multi-modal (text + vision + speech) under one key.
  • Best for: Vision through GPT-4 Vision, transcription via Whisper, generic chat.

Google Gemini

  • Portal: aistudio.google.com/app/apikey
  • Env var convention: GEMINI_API_KEY
  • Free tier: Generous free quota (1500 requests/day on Flash, lower on Pro). Best free option for prototyping.
  • Best for: Cost-sensitive prototyping, multi-modal, fast iteration before committing to a paid provider.
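Whichever provider is chosen, the handler-side call is an ordinary HTTPS request with the key in a header. A hedged sketch for Anthropic — buildClaudeRequest is an illustrative name, the model string is a placeholder (check Anthropic's docs for current model IDs), and real code must JSON-escape the prompt:

```kotlin
import java.net.URI
import java.net.http.HttpRequest

// Sketch of the request an app_callback handler would send to the Anthropic
// Messages API. Endpoint, headers, and body shape follow the public API docs.
fun buildClaudeRequest(apiKey: String, prompt: String): HttpRequest {
    // Placeholder model ID; `prompt` is assumed pre-escaped for JSON here.
    val body = """{"model": "claude-model-id",
                   "max_tokens": 1024,
                   "messages": [{"role": "user", "content": "$prompt"}]}"""
    return HttpRequest.newBuilder(URI("https://api.anthropic.com/v1/messages"))
        .header("x-api-key", apiKey) // from ANTHROPIC_API_KEY, never hardcoded
        .header("anthropic-version", "2023-06-01")
        .header("content-type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()
}
```

The same pattern applies to OpenAI and Gemini with their respective endpoints and auth headers; only the key's env var and the body schema change.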

Vision APIs

If the spec captures photos and the handler needs to interpret them, pick one:

Provider | Portal | Env var
OpenAI Vision (GPT-4o) | platform.openai.com/api-keys | OPENAI_API_KEY
Google Vision | console.cloud.google.com/apis/credentials (enable Cloud Vision API) | GOOGLE_API_KEY or service account
Anthropic Claude Vision | console.anthropic.com/settings/keys | ANTHROPIC_API_KEY
Azure Computer Vision | portal.azure.com → Computer Vision resource | AZURE_VISION_KEY + endpoint
AWS Bedrock (Claude/Nova) | console.aws.amazon.com/bedrock | AWS access key + secret + region

For most specs, just reuse the LLM provider's key — GPT-4o, Claude, and Gemini all do vision under the same key. Separate vision-only providers only matter when you need their specific capabilities (Google's high-accuracy OCR, Azure's domain models).

Translation

For multilingual specs:

Provider | Portal | Env var
DeepL | deepl.com/pro-api | DEEPL_API_KEY
Google Translate | console.cloud.google.com/apis/credentials (enable Translation API) | GOOGLE_API_KEY
HuggingFace Inference | huggingface.co/settings/tokens | HF_TOKEN

DeepL is generally higher quality for European languages; Google Translate has wider language coverage. HuggingFace Inference works for self-hosted or community models.
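For a translation handler, the wire format is simple. A sketch of the DeepL call — the form-encoded body and DeepL-Auth-Key header follow DeepL's public v2 API, while the function name here is made up; free-tier keys use the api-free.deepl.com host:

```kotlin
import java.net.URLEncoder
import java.nio.charset.StandardCharsets

// Build the form-encoded body for a DeepL v2 translate request.
fun deeplTranslateBody(text: String, targetLang: String): String {
    fun enc(s: String) = URLEncoder.encode(s, StandardCharsets.UTF_8)
    return "text=${enc(text)}&target_lang=${enc(targetLang)}"
}
// POST to https://api-free.deepl.com/v2/translate with header:
//   Authorization: DeepL-Auth-Key $DEEPL_API_KEY
```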

OCR

If the handler reads text out of camera frames:

Provider | Portal | Env var
Google Vision (text detection) | console.cloud.google.com (enable Cloud Vision API) | GOOGLE_API_KEY
Azure Computer Vision (Read API) | portal.azure.com | AZURE_VISION_KEY + endpoint
AWS Textract | console.aws.amazon.com/textract | AWS access key + secret + region

Most LLMs with vision (GPT-4o, Claude, Gemini) can do OCR-style tasks under their existing key — separate OCR services matter for high-volume document workflows.
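If the spec does warrant a dedicated OCR service, a sketch of the Google Vision text-detection body — the request shape follows the public images:annotate API, and visionOcrBody is an illustrative name:

```kotlin
import java.util.Base64

// Build the JSON body for a Google Vision TEXT_DETECTION request:
// the image goes in as base64, the feature type selects OCR.
fun visionOcrBody(imageBytes: ByteArray): String {
    val b64 = Base64.getEncoder().encodeToString(imageBytes)
    return """{"requests":[{"image":{"content":"$b64"},"features":[{"type":"TEXT_DETECTION"}]}]}"""
}
// POST to https://vision.googleapis.com/v1/images:annotate?key=$GOOGLE_API_KEY
```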

Premium voice synthesis (production)

Optional. Production apps use phone-native TTS by default (TextToSpeech on Android, AVSpeechSynthesizer on iOS) at zero cost. If the developer wants higher-quality voices:

Provider | Portal | Env var
ElevenLabs | elevenlabs.io/app/settings/api-keys | ELEVENLABS_API_KEY
Azure Neural TTS | portal.azure.com → Speech resource | AZURE_SPEECH_KEY + region
Play.ht | play.ht/studio/api-access | PLAYHT_API_KEY + user ID

Wire premium voice via ExtentosConfig.premiumVoice in the library config — see the configuration reference. Extentos doesn't sit in the runtime path; the library calls the provider directly from the phone.

Meta Developer Center (real hardware)

Skip this until the developer is ready to test on physical Meta Ray-Ban glasses. The browser simulator and LocalSimTransport don't need any of it.

When real hardware comes into play:

  1. Create a Meta Developer account at developers.meta.com.

  2. Register a Wearables app under the developer's account.

  3. Bind the app to platform-specific identifiers:

    • Android: package name + release signing SHA-256.
    • iOS: Bundle ID + Team ID + a custom URL scheme of the developer's choosing.
  4. Retrieve the Meta App ID and Client Token from the Meta Developer Center dashboard.

  5. Pass them to the library via ExtentosConfig:

    Android (Kotlin)
    ExtentosConfig(
        metaCredentials = MetaCredentials(
            appId = BuildConfig.META_APP_ID,
            clientToken = BuildConfig.META_CLIENT_TOKEN,
        ),
    )
    iOS (Swift)
    // Optional binding instead of force-unwrapping, so a missing Info.plist
    // entry fails with a clear message rather than a bare crash.
    guard
        let appID = Bundle.main.object(forInfoDictionaryKey: "MetaAppID") as? String,
        let clientToken = Bundle.main.object(forInfoDictionaryKey: "MetaClientToken") as? String
    else { fatalError("MetaAppID / MetaClientToken missing from Info.plist") }
    Wearables.configure(metaAppID: appID, clientToken: clientToken)
  6. Read the values from local.properties (Android dev) or Info.plist (iOS) — never hardcode them in source.
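On Android, step 6 is the standard local.properties → BuildConfig pattern; a Gradle sketch (build.gradle.kts, app module — conventional Android tooling, nothing Extentos-specific):

```kotlin
// Read secrets from local.properties (gitignored) and expose them to code
// as BuildConfig.META_APP_ID / BuildConfig.META_CLIENT_TOKEN.
import java.util.Properties

val localProps = Properties().apply {
    val f = rootProject.file("local.properties")
    if (f.exists()) f.inputStream().use { load(it) }
}

android {
    buildFeatures { buildConfig = true }
    defaultConfig {
        buildConfigField("String", "META_APP_ID",
            "\"${localProps.getProperty("META_APP_ID", "")}\"")
        buildConfigField("String", "META_CLIENT_TOKEN",
            "\"${localProps.getProperty("META_CLIENT_TOKEN", "")}\"")
    }
}
```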

The MCP getCredentialGuide tool walks through this setup interactively — the agent-facing entry point when the developer prefers step-by-step guidance over this static reference.

Voice in the simulator (no keys needed)

The browser simulator's STT and TTS run on Extentos-managed infrastructure (the voice proxy described in accounts and pricing). Developers don't sign up for Whisper, OpenAI, or any voice provider to use voice in the simulator — that's deliberate, so the first speak/listen experience works out of the box.

Voice usage is metered through the regular 1000-event anonymous meter — there's no separate voice quota. On real hardware, voice runs entirely on the phone (zero Extentos cost, zero provider keys).

Where to put keys

Never check provider keys into source control.

  • Android: local.properties for dev, secure CI/release config (e.g., environment variables on the build runner) for shipping. Generated handler templates have // TODO: load from BuildConfig comments at the right spot.
  • iOS: .xcconfig files (gitignored) or environment variables on the build runner. Generated handlers mark the spot with // TODO: load from Bundle.main.infoDictionary.
  • Local dev: developer's shell or a .env file (gitignored).

For real production deployments, prefer build-time secret injection over runtime env vars where possible — it keeps the keys out of the running app's process memory.
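For the local-dev .env case, a minimal loader sketch (illustrative — a dotenv library works just as well):

```kotlin
import java.io.File

// Minimal .env reader for local development: ignores blank lines and
// # comments, splits on the first '=', strips surrounding quotes.
fun loadDotEnv(file: File): Map<String, String> =
    file.readLines()
        .map(String::trim)
        .filter { it.isNotEmpty() && !it.startsWith("#") && '=' in it }
        .associate {
            val (key, value) = it.split('=', limit = 2)
            key.trim() to value.trim().removeSurrounding("\"")
        }
```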

When to surface this

If you're an AI agent scaffolding a spec for a developer:

  • At spec generation time, look at which capabilities the spec uses. Surface the matching rows from the Quick decision table inline so the developer knows what they'll need before they get deep into handler code.
  • At handler scaffolding time, embed // TODO comments at the env-var read sites pointing to the provider portal URL.
  • Before the first real-hardware test, surface the Meta Developer Center section as a separate setup step — it's a longer process (~15-30 min) and the developer wants to do it once, not interleaved with handler code.

Don't dump the full table on a developer who's still in the simulator; it adds noise. The simulator path is meant to need no keys at all.

  • Auth — Extentos account auth, separate from provider keys
  • Configuration — env vars and config-file settings
  • Production tools — getCredentialGuide, getProductionChecklist
  • Pricing — what the free tier covers