Errors & Troubleshooting

Common errors from the proxy and how to resolve them.

HTTP Error Codes

400 — Bad Request

  • "Invalid JSON" — Request body is not valid JSON.
  • "Missing model in request" — No model field in request body.
  • "Missing or invalid messages array in request" — The messages field is missing or not an array.
  • "Model not found" — The requested model name couldn't be resolved. Check the suggestions field in the response for similar model names.
  • "Native /v1/messages only supports Anthropic models" — You sent a non-Anthropic model to the /v1/messages endpoint. Use /v1/chat/completions for cross-provider routing.
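When handling a 400, the error body can tell you more than the status code alone — notably the suggestions field on "Model not found". The sketch below shows one way a client might surface that; the field names ("error", "suggestions") are assumed from the descriptions above, so check your proxy version's actual response schema.

```python
import json

def describe_400(body: str) -> str:
    """Summarize a 400 error body from the proxy.

    NOTE: the "error" and "suggestions" field names are assumptions
    based on the error list above, not a confirmed schema.
    """
    data = json.loads(body)
    message = data.get("error", "unknown error")
    suggestions = data.get("suggestions")
    if suggestions:
        return f"{message} (did you mean: {', '.join(suggestions)}?)"
    return message
```

For a "Model not found" body like {"error": "Model not found", "suggestions": ["claude-3-5-sonnet"]}, this yields a message that includes the suggested model name.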

401 — Unauthorized

  • "Missing Anthropic authentication" — No Authorization header, x-api-key, or ANTHROPIC_API_KEY env var found.

429 — Rate Limited / Budget Exceeded

  • rate_limit_exceeded — Too many requests per minute to a model. Check the Retry-After header.
  • budget_exceeded — Daily or hourly budget limit reached. Configure limits in the budget config section.
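A reasonable client reaction to a 429 is to honor the Retry-After header when present and fall back to exponential backoff otherwise. A minimal sketch (the backoff policy here is illustrative, not part of the proxy):

```python
def retry_delay(headers: dict, attempt: int, base: float = 1.0) -> float:
    """Choose a wait time (in seconds) after a 429 response.

    Honors the standard HTTP Retry-After header if it carries a
    numeric value; otherwise falls back to exponential backoff.
    """
    retry_after = headers.get("Retry-After")
    if retry_after is not None:
        try:
            return float(retry_after)
        except ValueError:
            pass  # Retry-After may also be an HTTP-date; ignored in this sketch
    return base * (2 ** attempt)
```

Note that budget_exceeded is different in kind: retrying won't help until the hourly or daily window resets, so treat it as a configuration problem rather than a transient error.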

500 — Internal Error

  • "Missing API_KEY environment variable" — Provider API key not set. Set the appropriate *_API_KEY env var.
  • "Provider error: ..." — Upstream provider returned an error. Check the error message for details.
  • "All cascade models exhausted" — Every model in the cascade chain failed.
  • "Request body too large" — Request body exceeds 10MB limit.

503 — Service Unavailable

  • "Provider is temporarily cooled down" — The target provider has had too many recent failures and is in cooldown. Wait for the cooldown to expire (default: 120 seconds).

Common Issues

Connection Refused

If you get ECONNREFUSED, the proxy isn't running. Start it:

relayplane start

Model Mismatch Warning

The log message ⚠️ Model mismatch: requested "X" but response contains "Y" means the upstream provider returned a different model than requested. This is informational and usually harmless (e.g., when a provider aliases models).

OAuth Doesn't Work for Routed Models

Anthropic OAuth tokens (sk-ant-oat*) may not work for all models. The proxy automatically falls back to the API key for rerouted requests when OAuth is detected. Set ANTHROPIC_API_KEY as a fallback.

Streaming Issues

Cascade mode doesn't work with streaming requests — it automatically falls back to complexity-based routing. This is by design, since cascade requires reading the full response to detect uncertainty.

Debug Tips

  • Start the proxy with --verbose for detailed routing logs
  • Check x-relayplane-* response headers to see routing decisions
  • Visit http://localhost:4801/dashboard for real-time metrics
  • Check provider health: curl http://localhost:4801/v1/telemetry/health
  • Quick health check: curl http://localhost:4801/health — if this returns {"status":"ok"}, the proxy is running fine.
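To act on the second tip programmatically, you can filter a response's headers down to the proxy's routing-decision headers. The x-relayplane-* prefix comes from the tip above; the specific header names that appear will vary, so this is just a sketch.

```python
def routing_info(headers: dict) -> dict:
    """Return only the proxy's x-relayplane-* routing headers.

    Header-name matching is case-insensitive, as HTTP header names are.
    """
    return {
        name: value
        for name, value in headers.items()
        if name.lower().startswith("x-relayplane-")
    }
```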