Frequently Asked Questions

Common questions about RelayPlane, answered.

Installation & Setup

How do I install RelayPlane?

Install globally via npm, then initialize and start:

npm install -g @relayplane/proxy
relayplane init
relayplane start

What environment variables do I need?

Set the API keys for the providers you use:

export ANTHROPIC_API_KEY=sk-ant-...
export OPENAI_API_KEY=sk-...
export GEMINI_API_KEY=...

How do I connect my AI tools to RelayPlane?

Point your tools to the proxy URL. The proxy runs at http://localhost:4100 by default:

# For Anthropic tools
export ANTHROPIC_BASE_URL=http://localhost:4100

# For OpenAI tools
export OPENAI_BASE_URL=http://localhost:4100

Can I use RelayPlane with Claude Code / Cursor / Aider?

Yes! Any tool that uses the Anthropic or OpenAI API can be routed through RelayPlane. Just set the appropriate *_BASE_URL environment variable before starting your tool.
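As a sketch, launching a tool through the proxy looks like the following. Note that `claude` below is just a placeholder for whatever CLI your tool provides; substitute the command you actually use:

```shell
# Point the tool at the local proxy (RelayPlane's default URL;
# adjust the port if you started the proxy elsewhere).
export ANTHROPIC_BASE_URL=http://localhost:4100

# Launch the tool in the same shell so it inherits the variable.
# "claude" is a placeholder for your tool's CLI.
if command -v claude >/dev/null 2>&1; then
  claude
fi
```

The key detail is that the `*_BASE_URL` variable must be set in the same shell session (or profile) that launches the tool.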

Privacy & Security

Does RelayPlane see my prompts?

No. The proxy runs locally on your machine. Your prompts go directly to the LLM providers, and their responses come back directly to you. RelayPlane only sees metadata (token counts, model used, latency), never the content of your messages.

What telemetry is collected?

Only anonymous metadata to improve routing:

  • Anonymous device ID (random, not fingerprintable)
  • Task type (inferred from token patterns, not content)
  • Model used, token counts, latency, cost

Use relayplane --audit to see exactly what's collected before sending.

How do I disable telemetry?

# Disable telemetry completely
relayplane telemetry off

# Or run in offline mode (no network calls except to LLM providers)
relayplane --offline

Is my data stored anywhere?

Usage data is stored locally in ~/.relayplane/. Telemetry is on by default: anonymous aggregated stats are sent to improve routing. Your prompts go directly to LLM providers, and your responses are never stored or transmitted. To disable telemetry, run relayplane telemetry off.

Pricing & Billing

Is the free tier really free?

Yes! The free tier includes static model routing and local cost tracking. No credit card required, no time limit. Use it forever.

What do I get with Pro ($29/month)?

Pro adds configurable task-aware routing via the Relay Network:

  • ML-powered model selection optimized for your usage patterns
  • 30-day history and analytics in the dashboard
  • Unlimited Relay Network calls
  • Priority support

Most users save 40-60% on API costs. That's $40-60 saved on a typical $100/mo spend, well above the $29 Pro subscription.

Can I try Pro before paying?

Yes! Start a 7-day free trial. No credit card required. If you don't upgrade, you automatically revert to the free tier.

How do I cancel my subscription?

Go to Dashboard → Billing → Cancel Subscription. You'll keep Pro features until the end of your billing period, then revert to the free tier.

Troubleshooting

The proxy isn't starting. What's wrong?

Check these common issues:

  • Port 4100 may be in use. Try relayplane start --port 3002
  • Ensure Node.js 18+ is installed
  • Check that API keys are set in your environment
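The checks above can be run from the shell before starting the proxy. A minimal pre-flight sketch, assuming a POSIX shell (the port check is skipped if lsof isn't installed):

```shell
# 1. Is anything already listening on the default port?
port=4100
if command -v lsof >/dev/null 2>&1 && lsof -i ":$port" >/dev/null 2>&1; then
  echo "port $port is already in use - try: relayplane start --port 3002"
else
  echo "port $port looks free"
fi

# 2. Is Node.js 18+ installed?
if command -v node >/dev/null 2>&1; then
  node --version
else
  echo "node not found - install Node.js 18 or later"
fi

# 3. Are any provider keys exported? (prints names only, never values)
env | grep -o '^[A-Z_]*_API_KEY' || echo "no *_API_KEY variables set"
```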

My requests are failing with 401 errors

This means the proxy is running but your API key is invalid or not set. Check that the API key for your provider is correctly exported in the terminal where the proxy is running: ANTHROPIC_API_KEY, OPENAI_API_KEY, GEMINI_API_KEY, or XAI_API_KEY.
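A quick way to confirm which keys the proxy's shell can actually see is plain shell (not a RelayPlane command); it prints only "set" or "missing", never the key values:

```shell
# Report whether each provider key is visible in this shell.
for v in ANTHROPIC_API_KEY OPENAI_API_KEY GEMINI_API_KEY XAI_API_KEY; do
  if [ -n "$(printenv "$v")" ]; then
    echo "$v: set"
  else
    echo "$v: missing"
  fi
done
```

Run this in the same terminal where you start the proxy; a key exported in a different terminal or shell profile won't be visible here.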

How do I see my usage statistics?

# View stats in terminal
relayplane stats

# Or visit your dashboard at relayplane.com/dashboard

Where are my config files stored?

All RelayPlane data is stored in ~/.relayplane/:

  • config.json — Settings and API key
  • telemetry.jsonl — Local telemetry data
  • stats.db — Usage statistics

Features

What is "configurable task-aware routing"?

RelayPlane analyzes each request and routes it to the optimal model based on task type, context length, and cost. For example, simple classification tasks might go to a faster, cheaper model, while complex coding tasks go to the most capable model.

How does the Relay Network know which model to use?

The Relay Network uses ML models trained on anonymized usage patterns to predict which model will perform best for each task type. It considers factors like token count, input/output ratio, tool usage, and historical success rates.

What models does RelayPlane support?

RelayPlane supports all major AI providers:

  • Anthropic: Claude Sonnet 4, Claude Opus 4, Claude 3.5 Haiku
  • OpenAI: GPT-4.1, GPT-4o, GPT-4o-mini, o3
  • Google: Gemini 2.5 Pro/Flash, Gemini 2.0 Flash
  • xAI: Grok 3
  • Local: Ollama models

Can I override the routing decision?

Yes. The Relay Network provides recommendations, but you can configure routing rules to always use specific models for certain task types, or exclude models entirely.
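As an illustration only — the exact schema of ~/.relayplane/config.json is not documented here, so treat every key below as hypothetical — a routing-rules override pinning task types to models and excluding a model entirely might look like:

```json
{
  "routing": {
    "rules": [
      { "task": "classification", "model": "claude-3-5-haiku" },
      { "task": "coding", "model": "claude-opus-4" }
    ],
    "exclude": ["grok-3"]
  }
}
```

Consult the actual configuration reference for the real key names before editing your config.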

Still have questions? Reach out at help@relayplane.com