# API Reference

Endpoints implemented by the standalone proxy server.
## LLM endpoints

- `POST /v1/messages` — Anthropic Messages API passthrough/routing (streaming + non-streaming)
- `POST /v1/messages/count_tokens` — Anthropic token counting passthrough
- `POST /v1/chat/completions` — OpenAI-compatible chat completions routing
- `GET /v1/models` — returns RelayPlane virtual models

Implementation note: the model list and chat completions routes use URL substring matching (`/models`, `/chat/completions`).
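As a sketch of the Anthropic-style passthrough, the request below follows the same pattern as the chat completions example later in this page. The `x-api-key` and `anthropic-version` headers are assumptions based on the upstream Anthropic Messages API; the proxy may inject or override them.

```shell
# Non-streaming Anthropic-style request through the proxy.
# Header names are assumptions from the upstream Messages API.
curl -X POST http://localhost:4100/v1/messages \
  -H "Content-Type: application/json" \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "relayplane:auto",
    "max_tokens": 256,
    "messages": [{"role":"user","content":"hello"}]
  }'
```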
## Health & status

- `GET /status`
- `GET /health`
- `GET /healthz` (alias of `/health`)
## Control endpoints

- `POST /control/enable`
- `POST /control/disable`
- `GET /control/status`
- `GET /control/stats`
- `POST /control/config` — JSON merge patch
- `POST /control/kill` — body: `{"sessionKey":"..."}` or `{"all":true}`
/control/* endpoints are administrative. Keep them bound to 127.0.0.1 by default, and only expose them with strong network/auth controls.
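Since `/control/config` accepts a JSON merge patch (RFC 7386), objects in the body merge recursively into the existing config and a `null` value deletes a key. A minimal sketch, assuming hypothetical config fields (`routing.mode`, `logLevel`) that may not match the proxy's actual schema:

```shell
# Set routing.mode and delete logLevel; all other config keys are left untouched.
# The field names here are illustrative, not the proxy's real config keys.
curl -X POST http://localhost:4100/control/config \
  -H "Content-Type: application/json" \
  -d '{"routing": {"mode": "manual"}, "logLevel": null}'
```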
## Telemetry endpoints

- `GET /v1/telemetry/stats` — query: `?days=7`
- `GET /v1/telemetry/runs` — query: `?limit=50&offset=0`
- `GET /v1/telemetry/savings`
- `GET /v1/telemetry/health`
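The query parameters above can be combined as usual; for example (piping through `jq` is optional pretty-printing, not required):

```shell
# Stats for the last 7 days, then the first page of 50 runs.
curl -s "http://localhost:4100/v1/telemetry/stats?days=7" | jq .
curl -s "http://localhost:4100/v1/telemetry/runs?limit=50&offset=0" | jq .
```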
## Mesh + config + dashboard

- `GET /v1/mesh/stats`
- `POST /v1/mesh/sync`
- `GET /v1/config`
- `GET /` and `GET /dashboard`
- `GET /dashboard/config`
## Security notes

Treat `/control/*`, `/v1/config`, and `/dashboard/config` as sensitive admin endpoints. Keep the proxy bound to 127.0.0.1 by default. If you must bind to 0.0.0.0, put a firewall and a reverse proxy with access controls in front of it, and require strong authentication before exposing it beyond localhost.
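One way to apply the firewall advice above is a host firewall that only admits a single trusted admin host. This sketch assumes `ufw` and uses a documentation IP; your tooling, port, and trusted addresses will differ:

```shell
# Example only: deny all inbound traffic, then allow one trusted admin
# host (203.0.113.10 is a placeholder) to reach the proxy port.
ufw default deny incoming
ufw allow from 203.0.113.10 to any port 4100 proto tcp
ufw enable
```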
## Example request

```shell
curl -X POST http://localhost:4100/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "relayplane:auto",
    "messages": [{"role":"user","content":"hello"}]
  }'
```

Unknown routes return 404. The fallback error currently lists the supported core endpoints: `POST /v1/messages`, `POST /v1/chat/completions`, and `GET /v1/models`.