Local-First AI: Keep Your Code Private

Your code is your IP. RelayPlane runs entirely on your machine — no cloud relay, no third-party data access. Route AI requests through a local proxy that you control.

Why Local-First Matters

Privacy

No intermediary sees your prompts or code. Direct connections to AI providers only.

Speed

No extra network hop. The local proxy adds under 1 ms of latency, compared with the full round trip a cloud routing service requires.

Control

Your rules, your models, your budget limits. No vendor lock-in. MIT licensed.

How It Works

RelayPlane is a lightweight proxy that runs on localhost:4000. Point your AI tools at it instead of directly at Anthropic/OpenAI. The proxy:

  • Analyzes each request to pick the optimal model
  • Connects directly to AI providers (your API keys, your accounts)
  • Logs usage and costs locally. Telemetry is on by default and sends only anonymous metadata; your prompts go directly to providers.
  • Works with OpenClaw, Cursor, Continue, and any OpenAI-compatible tool
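
Because the proxy speaks an OpenAI-compatible protocol, pointing a tool at it is typically a one-line change. A minimal sketch, assuming RelayPlane exposes an OpenAI-compatible endpoint on localhost:4000 (the /v1 path and the environment-variable approach are illustrative; check your tool's own base-URL setting):

```shell
# Redirect any tool that honors OPENAI_BASE_URL to the local proxy.
# Assumption: RelayPlane serves an OpenAI-compatible API at localhost:4000.
export OPENAI_BASE_URL="http://localhost:4000/v1"

# Requests now hit your machine first; the proxy forwards them
# to the provider using your own API key and account.
echo "Requests will be sent to: $OPENAI_BASE_URL"
```

Tools that don't read this environment variable usually accept the same URL in a settings field labeled "API base URL" or similar.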

Enterprise & Compliance

For teams with strict data policies, RelayPlane is built for compliance by design: no prompts or responses are stored externally, audit logs stay on your infrastructure, and it works behind corporate firewalls and VPNs. The architecture is SOC 2- and HIPAA-friendly.

Try RelayPlane Locally

One command to install. Your data stays yours.

Get Started