Windsurf is the IDE from Codeium built around AI-first workflows. With BYOK support, you bring your own API keys and route calls directly through any compatible endpoint.
Windsurf uses an OpenAI-compatible base URL setting, making it straightforward to point at RelayPlane. Every tab completion, inline edit, and chat request routes through your local proxy for full cost visibility.
Works with Windsurf's built-in BYOK. No code changes required.
npx @relayplane/proxy --port 4801
RelayPlane runs locally. No cloud dependency. No data leaves your machine without opt-in.
http://localhost:4801/v1
Set the API Base URL in Windsurf AI Settings. Windsurf treats it as a standard OpenAI-compatible endpoint.
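Before pointing Windsurf at the proxy, it can help to confirm the base URL answers. A minimal sketch, assuming the proxy exposes the `/v1/models` listing that OpenAI-compatible servers typically provide (whether RelayPlane implements this exact route is an assumption):

```shell
# Sanity check: the proxy should respond on its OpenAI-compatible base URL.
# NOTE: /v1/models is standard for OpenAI-compatible servers; if RelayPlane
# uses a different route, adjust accordingly.
curl -s http://localhost:4801/v1/models
```

If this returns JSON rather than a connection error, Windsurf will be able to reach the same endpoint.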
autocomplete -> haiku
Every request is analyzed and routed to the optimal model. Costs are tracked per-request in local SQLite.
Route seamlessly between models from Windsurf
relayplane:auto
Infers the task type from the prompt. Tab completions go to Haiku; complex edits go to Sonnet or Opus.
autocomplete -> haiku
refactor -> sonnet
relayplane:cost
Always routes to the cheapest models. Maximum savings; escalates on failure.
everything -> haiku # escalate on error
relayplane:quality
Uses the best available model. Maximum quality, equivalent to running without cost optimization.
everything -> opus # full quality mode
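The routing modes above can be exercised directly against the proxy. A sketch, assuming the mode name is passed in the standard OpenAI `model` field of a chat completion request (how RelayPlane actually selects modes per-request may differ; the prompt text is illustrative):

```shell
# Hypothetical: select a routing mode by using its name as the model.
# The proxy chooses the concrete model (Haiku/Sonnet/Opus) behind the scenes.
curl -s http://localhost:4801/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "relayplane:cost",
    "messages": [{"role": "user", "content": "Refactor this function"}]
  }'
```

Windsurf itself never needs to know which concrete model answered; it only sees a standard OpenAI-style response.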
RelayPlane adds intelligent routing and observability to every Windsurf request.
relayplane stats --days 7
See exactly what each session costs.
relayplane stats --breakdown
Cost by model, by task type, and by hour.
cascade: haiku -> sonnet -> opus
Try cheap models first; escalate on uncertainty.
provider cooldowns: auto
Failing providers are paused and alternatives are used.
~/.relayplane/data.db
All logs stored locally in SQLite.
telemetry: off (default)
No data sent anywhere without opt-in.
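Because the logs live in a plain SQLite file, they can be inspected with the stock `sqlite3` CLI. The table names are whatever the proxy creates, so `.tables` discovers them without assuming a schema:

```shell
# List the tables RelayPlane created in its local store, then explore
# interactively with ordinary SQL if you want per-request detail.
sqlite3 ~/.relayplane/data.db ".tables"
```

This is also a convenient way to confirm that no data exists outside your machine: everything the proxy knows is in this one file.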
POST /v1/chat/completions
Standard OpenAI endpoint; no Windsurf config changes.
streaming: supported
Full SSE streaming passthrough.
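The streaming passthrough can be tested directly with the standard OpenAI `stream` flag. A sketch (the model name and prompt are illustrative):

```shell
# Request a streamed completion; -N disables curl's buffering so the
# SSE chunks print as they arrive rather than all at once.
curl -sN http://localhost:4801/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "relayplane:auto",
    "stream": true,
    "messages": [{"role": "user", "content": "hello"}]
  }'
```

You should see `data:` lines arriving incrementally, exactly as Windsurf consumes them.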
Start the proxy in verbose mode to see what it is doing:
npx @relayplane/proxy --port 4801 -v
Confirm the API Base URL in Windsurf Settings > AI Settings is set to:
http://localhost:4801/v1
Make sure your API key is set correctly in both Windsurf settings and the proxy:
export ANTHROPIC_API_KEY="sk-ant-..."
Check the stats endpoint while Windsurf is active:
curl http://localhost:4801/control/stats
Switch back to the original provider URL in Windsurf AI Settings, or stop the proxy process.
# Stop proxy: Ctrl+C in the terminal running the proxy