WINDSURF BYOK · ROUTING ACTIVE
WINDSURF INTEGRATION
CODEIUM BYOK · SMART ROUTING

RelayPlane
+ Windsurf
Save 50%

Codeium's AI-powered IDE supports BYOK via a custom API base URL. Route Windsurf's LLM calls through RelayPlane for cost tracking and intelligent model routing.

$ npx @relayplane/proxy --port 4801
WINDSURF SESSION LOG · ROUTING
[09:12:01] Tab autocomplete -> haiku
[09:12:02] Inline suggestion -> haiku
[09:12:05] Explain selection -> haiku
[09:12:08] Refactor function -> sonnet
[09:12:11] Fix bug -> sonnet
[09:12:14] Cascade review -> haiku
[09:12:17] Architect new module -> opus
[09:12:20] Write tests -> haiku
✓ Session complete · 8 calls · $0.31 (was $1.92)
ALL OPUS: $15.00/1M
WITH RELAYPLANE: avg $0.85/1M
// ABOUT WINDSURF

Codeium's AI-Powered IDE

Windsurf is the IDE from Codeium built around AI-first workflows. With BYOK support, you bring your own API keys and route calls directly through any compatible endpoint.

Windsurf uses an OpenAI-compatible base URL setting, making it straightforward to point at RelayPlane. Every tab completion, inline edit, and chat request routes through your local proxy for full cost visibility.

TYPICAL WINDSURF SESSION COSTS
Quick fix (20 calls): $2.40 -> $1.20
Debug session (80 calls): $9.60 -> $4.80
Feature build (200 calls): $24.00 -> $12.00
MONTHLY (heavy use): $600+ -> $300
50% · Average cost reduction
200+ · API calls per heavy session
<1ms · Routing decision latency
// QUICK START

Two Steps to Start Saving

Works with Windsurf's built-in BYOK. No code changes required.

~/projects/my-project
# Step 1: Start the RelayPlane proxy
npx @relayplane/proxy --port 4801
# Step 2: In Windsurf Settings > AI Settings, set API Base URL
Setting · Value
API Base URL · http://localhost:4801/v1
API Key · your-anthropic-key
✓ All Windsurf LLM calls now route through RelayPlane
WINDSURF SETTINGS PATH
1. Open Windsurf
2. Go to Settings (gear icon or Cmd/Ctrl+,)
3. Navigate to AI Settings
4. Set API Base URL to http://localhost:4801/v1
5. Save and restart Windsurf if prompted
// HOW IT WORKS

Transparent Proxy Architecture

01

Start the Proxy

npx @relayplane/proxy --port 4801

RelayPlane runs locally. No cloud dependency. No data leaves your machine without opt-in.

02

Point Windsurf at It

http://localhost:4801/v1

Set the API Base URL in Windsurf AI Settings. Windsurf treats it as a standard OpenAI-compatible endpoint.

03

Requests Route Intelligently

autocomplete -> haiku

Every request is analyzed and routed to the optimal model. Costs are tracked per-request in local SQLite.

5 PROVIDERS
50% AVG. SAVINGS
<1MS ROUTING

Five Providers. One Interface.

Route seamlessly across models from Windsurf

Anthropic
Claude 4.5 (Opus, Sonnet, Haiku)
OpenAI
GPT-5.2, o1, o3
Google
Gemini 2.0, 1.5
xAI
Grok-3, Grok-3-mini
Moonshot
v1-8k to 128k
// ROUTING MODES

Choose Your Strategy

DEFAULT

Smart Routing

relayplane:auto

Infers task type from prompt. Tab completions go to Haiku. Complex edits go to Sonnet or Opus.

autocomplete -> haiku
refactor -> sonnet
AGGRESSIVE

Cost Priority

relayplane:cost

Always routes to the cheapest model. Maximum savings; escalates on failure.

everything -> haiku
# escalate on error
CONSERVATIVE

Quality Priority

relayplane:quality

Uses the best available model. Maximum quality; costs are similar to running without optimization.

everything -> opus
# full quality mode
// BENEFITS

What You Get

RelayPlane adds intelligent routing and observability to every Windsurf request.

Per-Request Cost Tracking

relayplane stats --days 7

See exactly what each session costs

relayplane stats --breakdown

Cost by model, by task type, by hour

Automatic Model Fallback

cascade: haiku -> sonnet -> opus

Try cheap models first, escalate on uncertainty

provider cooldowns: auto

Failing providers are paused, alternatives used

Local-First Privacy

~/.relayplane/data.db

All logs stored locally in SQLite

telemetry: off (default)

No data sent anywhere without opt-in

OpenAI-Compatible

POST /v1/chat/completions

Standard OpenAI endpoint, no Windsurf config changes

streaming: supported

Full SSE streaming passthrough

// TROUBLESHOOTING

Common Issues

Proxy not running?

Start it with verbose mode to see what is happening:

npx @relayplane/proxy --port 4801 -v

Windsurf not routing through proxy?

Confirm the API Base URL in Windsurf Settings > AI Settings is set to:

http://localhost:4801/v1

Getting authentication errors?

Make sure your API key is set correctly in both Windsurf settings and the proxy:

export ANTHROPIC_API_KEY="sk-ant-..."

How do I verify routing is working?

Check the stats endpoint while Windsurf is active:

curl http://localhost:4801/control/stats

Want to bypass the proxy temporarily?

Switch back to the original provider URL in Windsurf AI Settings, or stop the proxy process.

# Stop proxy: Ctrl+C in the terminal running the proxy

Start Saving Today

Two settings. No cloud dependency. Instant cost reduction in Windsurf.

100% OPEN SOURCE · MIT LICENSE · WINDSURF COMPATIBLE