Now in public beta

Replace your .env file
with one token.

Encrypted vault, automatic rotation, and a smart proxy for every API your app talks to. One credential in, every provider out — your raw keys never leave the vault.

Before: 14 secrets · plain text · scattered
# .env
OPENAI_API_KEY=sk-proj-••••••••••••••
ANTHROPIC_API_KEY=sk-ant-••••••••••••••
STRIPE_SECRET_KEY=sk_live_•••••••••••••
STRIPE_WEBHOOK_SECRET=whsec_••••••••••••
RESEND_API_KEY=re_••••••••••••••••••
SUPABASE_SERVICE_ROLE_KEY=eyJhb••••••••
SENTRY_DSN=https://•••••••••••••••
GITHUB_TOKEN=ghp_••••••••••••••••••
POSTHOG_API_KEY=phc_••••••••••••••••
CLOUDINARY_API_SECRET=••••••••••••
TWILIO_AUTH_TOKEN=••••••••••••••••••••
GOOGLE_CLIENT_SECRET=GOCSPX-••••••••••••
GROQ_API_KEY=gsk_•••••••••••••••••
MUX_TOKEN_SECRET=••••••••••••••••••••
After: 1 token · encrypted · auto-rotating
# .env
APILOCKER_REFRESH_TOKEN=rtk_••••••••

# that's it.
# every provider above is reachable
# through one SDK call:
# await apilocker.proxy('openai', ...)
# await apilocker.proxy('stripe', ...)
# await apilocker.proxy('resend', ...)
Three pillars

One vault, three types of credentials.

Every secret a developer needs to keep safe, organized into categories that match how real apps are built.

LLM API Keys

OpenAI, Anthropic, Gemini, Groq, Mistral.

Keep your AI workloads secure while you ship. Inject model keys straight into your app's environment with apilocker run — no more OpenAI keys sitting in .env files or accidentally committed to git.

  • Zero-config provider templates for every major LLM
  • Rotate keys without redeploying
  • Per-key usage telemetry and anomaly alerts
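The injection model is simple: decrypt the vault in memory, merge the secrets into your process's environment, and run your command. A minimal TypeScript sketch of that idea, with a stand-in secret value (this is an illustration of the concept, not the CLI's actual implementation):

```typescript
// Conceptual sketch of what `apilocker run -- node app.js` does:
// merge decrypted secrets into the child environment, never onto disk.
import { spawnSync } from "node:child_process";

// Stand-in for decrypted vault contents (hypothetical value).
const secrets: Record<string, string> = {
  OPENAI_API_KEY: "sk-proj-EXAMPLE",
};

// Launch the app with secrets present only in its environment.
const result = spawnSync(
  process.execPath,
  ["-e", "console.log(process.env.OPENAI_API_KEY)"],
  { env: { ...process.env, ...secrets }, encoding: "utf8" }
);

console.log(result.stdout.trim()); // sk-proj-EXAMPLE
```

Your code keeps reading process.env.OPENAI_API_KEY; only the source of the value changes.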

Service API Keys

Stripe, Twilio, Resend, ElevenLabs, Sentry, and everything else.

Replace every .env file in your project with one token. 19+ provider templates come pre-configured; anything else works via "Custom." Your code still reads process.env.STRIPE_SECRET_KEY — just don't put it in a file anymore.

  • 19+ built-in provider templates, custom for the rest
  • Smart proxy route with audit logs on every call
  • Pause/resume individual keys during incident response

OAuth Credentials

Google, GitHub, Slack, Microsoft, Notion, Spotify, and more.

Store full OAuth credential sets — client ID, client secret, refresh token, scopes — as a single named credential. apilocker run injects every field as its own env var. Nango-quality OAuth management at solo-dev prices.

  • 14 OAuth provider templates with pre-filled endpoints
  • Multi-field encrypted storage, per-field reveal
  • One-command injection via apilocker run
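One way to picture the per-field injection: each field of a named credential becomes its own env var. A hedged TypeScript sketch of that flattening; the UPPER_SNAKE naming scheme below is an assumption for illustration, not documented behavior:

```typescript
// Hypothetical flattening of a multi-field OAuth credential into env vars.
function toEnvVars(
  name: string,
  fields: Record<string, string>
): Record<string, string> {
  const env: Record<string, string> = {};
  for (const [field, value] of Object.entries(fields)) {
    // e.g. ("google", "client_id") -> GOOGLE_CLIENT_ID
    env[`${name}_${field}`.toUpperCase()] = value;
  }
  return env;
}

const env = toEnvVars("google", {
  client_id: "example-id.apps.googleusercontent.com",
  client_secret: "GOCSPX-EXAMPLE",
  refresh_token: "1//EXAMPLE",
  scopes: "openid email",
});

console.log(env.GOOGLE_CLIENT_ID); // example-id.apps.googleusercontent.com
```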
Features

Built for how you ship.

A quiet layer between your apps and the APIs they call. Encrypted, observable, and invisible by default.

Encrypted vault

Your keys, locked behind device-bound auth.

Every credential is encrypted at rest with a master token bound to your machine. Rotate on a schedule, issue scoped tokens per project, and revoke instantly — without touching the underlying key.

vault.enc
openai       sk-proj-••••
anthropic    sk-ant-••••
stripe       sk_live_••••
supabase     eyJhbG••••
resend       re_••••
cloudflare   ••••••
Smart proxy

Stream responses without exposing the key.

Your app hits the proxy. The proxy injects the real credential, forwards to the provider, and streams the response back — SSE and all. Your raw key never leaves the vault.

proxy.request
1 Receiving inbound request
2 Validating scoped_token
3 Injecting api_key from vault
4 Forwarding to provider
5 Streaming response
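The key move is step 3: the scoped token on the inbound request is swapped for the real provider credential before forwarding. A toy TypeScript sketch of that swap, with assumed shapes and stand-in values (the proxy's actual internals aren't shown here):

```typescript
// Toy model of the credential-injection step: replace the caller's
// scoped token with the real key looked up from the vault.
type Vault = Record<string, string>;

function injectCredential(
  headers: Record<string, string>,
  provider: string,
  vault: Vault
): Record<string, string> {
  const key = vault[provider];
  if (!key) throw new Error(`no credential stored for ${provider}`);
  // The scoped token never reaches the upstream API.
  return { ...headers, authorization: `Bearer ${key}` };
}

const vault: Vault = { openai: "sk-proj-EXAMPLE" };
const outbound = injectCredential(
  { authorization: "Bearer rtk_scoped" },
  "openai",
  vault
);

console.log(outbound.authorization); // Bearer sk-proj-EXAMPLE
```

The caller only ever holds the scoped token; the Authorization header that reaches the provider is built inside the proxy.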
MCP ready

Built for AI agents from day one.

Expose your vault to Claude, Cursor, or any MCP-compatible client. Agents call real APIs without ever seeing the underlying credentials — and every call is logged, scoped, and revocable.

.mcp.json
{
  "mcpServers": {
    "apilocker": {
      "command": "apilocker",
      "args": ["mcp", "serve"],
      "scope": "openai,anthropic"
    }
  }
}

Your keys.
Your proxy.
Your control.

Free during beta — unlimited keys, unlimited tokens, unlimited calls. Sign up in 10 seconds.

$ npm install -g apilocker