Aider is a popular open-source AI coding assistant. It runs in your terminal, edits files directly, and commits changes with git. You describe what you want; aider writes the code.
Aider uses the OpenAI API format, so it respects the OPENAI_API_BASE environment variable. Set it to your RelayPlane proxy URL and every aider request routes through RelayPlane for full cost visibility.
One environment variable. No code changes to aider.
export OPENAI_API_BASE=http://localhost:4100
npx @relayplane/proxy --port 4100

RelayPlane runs locally. No cloud dependency. No data leaves your machine without opt-in.
OPENAI_API_BASE=http://localhost:4100

Set the environment variable before running aider. Aider treats the proxy as a standard OpenAI-compatible endpoint.
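In practice the whole setup is two environment variables, after which aider runs unmodified. A minimal session sketch (the key value is a placeholder, as in the troubleshooting section below):

```shell
# Point aider (or any OpenAI-compatible tool) at the local proxy.
export OPENAI_API_BASE=http://localhost:4100
export OPENAI_API_KEY="sk-..."   # your provider key; forwarded upstream by the proxy

# Sanity-check before launching aider as usual:
echo "$OPENAI_API_BASE"   # prints http://localhost:4100
```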
refactor -> sonnet

Every request is analyzed and routed to the optimal model. Costs are tracked per request in local SQLite.
Route seamlessly between models from aider
relayplane:auto

Infers task type from the prompt. Simple edits go to Haiku. Complex refactors go to Sonnet or Opus.
add imports -> haiku
refactor -> sonnet
relayplane:cost

Always routes to the cheapest model. Maximum savings; escalates on failure.
everything -> haiku # escalate on error
relayplane:quality

Uses the best available model. Maximum quality; behaves much like running without cost optimization.
everything -> opus # full quality mode
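The mapping above implies a keyword-style inference step for relayplane:auto. As an illustrative sketch only (not RelayPlane's actual router, whose logic is not documented here), that kind of heuristic might look like:

```shell
# Hypothetical keyword heuristic: cheap model by default,
# stronger models for prompts that signal heavier work.
route() {
  case "$1" in
    *refactor*|*rewrite*)    echo sonnet ;;
    *architecture*|*design*) echo opus ;;
    *)                       echo haiku ;;
  esac
}

route "add missing imports"          # haiku
route "refactor the payment module"  # sonnet
```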
RelayPlane adds intelligent routing and observability to every aider request.
relayplane stats --days 7

See exactly what each aider session costs.

relayplane stats --breakdown

Cost by model, by task type, by hour.
cascade: haiku -> sonnet -> opus

Try cheap models first, escalate on uncertainty.

provider cooldowns: auto

Failing providers are paused, alternatives used.
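The cascade can be pictured as a try-then-escalate loop. A sketch of the assumed behavior, not RelayPlane's actual implementation; `try_model` is a stand-in for a real request, rigged here so only opus succeeds in order to show the escalation path:

```shell
# Stand-in for sending a request to a model; pretend only opus succeeds.
try_model() {
  [ "$1" = "opus" ]
}

# Walk the cascade: cheapest first, escalate on failure.
for model in haiku sonnet opus; do
  if try_model "$model"; then
    echo "answered by $model"   # prints "answered by opus"
    break
  fi
done
```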
~/.relayplane/data.db

All logs stored locally in SQLite.

telemetry: off (default)

No data sent anywhere without opt-in.
POST /v1/chat/completions

Standard OpenAI endpoint; no aider config changes.

streaming: supported

Full SSE streaming passthrough.
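Because the proxy speaks the standard endpoint, any OpenAI-style request works against it. A minimal payload of the shape aider sends (model name and message content are placeholders):

```shell
# Build a minimal chat-completions request body; the proxy forwards
# bodies like this to the upstream provider.
cat > /tmp/req.json <<'EOF'
{
  "model": "gpt-4o",
  "stream": true,
  "messages": [{"role": "user", "content": "rename variable x to count"}]
}
EOF

# With the proxy running on port 4100, send it with:
#   curl http://localhost:4100/v1/chat/completions \
#     -H "Authorization: Bearer $OPENAI_API_KEY" \
#     -H "Content-Type: application/json" \
#     -d @/tmp/req.json
```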
Start the proxy in verbose mode to see what is happening:
npx @relayplane/proxy --port 4100 -v
Confirm OPENAI_API_BASE is set in your current shell session:
echo $OPENAI_API_BASE # Should output: http://localhost:4100
Make sure your API key is set. Aider passes it through the proxy to the upstream provider:
export OPENAI_API_KEY="sk-..."
Check the stats endpoint while aider is running:
curl http://localhost:4100/control/stats
Unset the environment variable for that session or run aider without it:
OPENAI_API_BASE= aider --model gpt-4