Route OpenClaw through Portkey for observability, cost tracking, and reliability
OpenClaw is an open-source AI assistant with persistent memory and multi-platform access. Routing it through Portkey gives you request logs, cost tracking, automatic failovers, and team controls.
Already have OpenClaw set up? Start from step 3. Already have Portkey configured? Re‑use your existing provider slug and API key.
1. Set up OpenClaw (if not already installed)

1.1 Install OpenClaw

Run one of the following:
```shell
# macOS or Linux
npm install -g openclaw@latest

# macOS or Linux (curl installer)
curl -fsSL https://openclaw.ai/install.sh | bash

# Windows (PowerShell)
iwr -useb https://openclaw.ai/install.ps1 | iex
```
1.2 Run the onboarding wizard
```shell
openclaw onboard --install-daemon
```
Follow the onboarding wizard according to your needs. If you're unsure, you can refer to this minimal QuickStart path:
```text
🦞 OpenClaw 2026.2.15 — onboarding (QuickStart)

- Acknowledge the security notice:
  - Choose: "Yes, I understand this is powerful and inherently risky."
- Onboarding mode:
  - Choose: QuickStart
- Model/auth provider:
  - Choose: OpenAI
  - Auth method: OpenAI API key
  - Paste your OpenAI API key when prompted
  - Keep default model: openai/gpt-5.1-codex (or similar default suggested)
- Channels (QuickStart):
  - Choose: Skip for now (you can add channels later via `openclaw channels add`)
- Skills:
  - Choose: No / Skip for now
- Hooks:
  - Select: Skip for now
- Gateway service:
  - Let it install the Gateway service (LaunchAgent on macOS)
- Hatch your bot:
  - Choose: Do this later
- When you're ready:
  - Dashboard: run `openclaw dashboard --no-open`
  - Control UI: follow the printed `http://127.0.0.1:18789/...` link
```
This completes the basic OpenClaw setup.

2. Make sure OpenClaw is running

Once onboarding is complete, ensure the Gateway service is up (or start it via your OS tools or openclaw commands). When OpenClaw is set up and running, continue with the Portkey integration below.

3. Add your provider to Portkey

Open Model Catalog, click Add Provider, enter your provider API key, and create a slug (for example, gemini).

4. Create a Portkey API key

Go to API Keys, click Create, and copy the key.

5. Open your OpenClaw config

Edit the OpenClaw config file directly:
```shell
open ~/.openclaw/openclaw.json
```
6. Add the Portkey provider and agent model

In openclaw.json, add or merge the following snippet:
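The exact schema depends on your OpenClaw version; as a sketch (field names such as `models.providers` and `agents.defaults` are illustrative here — check the OpenClaw model-providers docs for the authoritative shape), the merged config might look like:

```json
{
  "models": {
    "providers": {
      "portkey": {
        "baseUrl": "https://api.portkey.ai/v1",
        "apiKey": "YOUR_PORTKEY_API_KEY",
        "api": "openai-completions",
        "models": [
          { "id": "@gemini/gemini-2.5-flash", "name": "Gemini 2.5 Flash" }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": { "primary": "portkey/@gemini/gemini-2.5-flash" }
    }
  }
}
```

The `@gemini` prefix is the provider slug you created in step 3; swap in whatever slug and model you configured.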
Replace YOUR_PORTKEY_API_KEY with your actual Portkey API key.
Remember to keep "api" set to "openai-completions" or "openai-responses" — Portkey transforms requests based on the OpenAI API format, and other values from the OpenClaw docs will be incompatible here.

7. Test the integration

After saving the config, run:
```shell
openclaw agent --agent main --message "hi"
```
If everything is configured correctly, this message will be routed through Portkey to your configured @gemini/gemini-2.5-flash model.

For more advanced configuration options, see the OpenClaw docs.
Add multiple models from any provider you've configured in Portkey, then reference them from your agent config using primary and fallbacks:

- Primary traffic goes to portkey/@mistral/open-mixtral-8x7b.
- Fallback traffic goes to portkey/@gemini/gemini-2.5-flash when the primary fails.

If you only want to use a single model, set it in primary and leave fallbacks empty.
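Assuming an agent model config keyed by `primary` and `fallbacks` (illustrative field names — check the OpenClaw docs for the exact schema), that arrangement might look like:

```json
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "portkey/@mistral/open-mixtral-8x7b",
        "fallbacks": ["portkey/@gemini/gemini-2.5-flash"]
      }
    }
  }
}
```

Both model IDs must also be listed under the portkey provider's models for OpenClaw to resolve them.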
Create configs in Portkey Configs and attach them to your API key. From the Portkey dashboard, open Configs, create a config, and copy its Config ID from the config details page.

You can then pass this Config ID from OpenClaw by adding a header in your openclaw.json:
```json
"headers": {
  "x-portkey-config": "pc-config-id"
}
```
If your Portkey config is responsible for switching between providers or models based on conditions (e.g., latency, cost, availability), you can point OpenClaw at a virtual model and let Portkey handle the routing:
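One way to sketch this (same caveat as above: field names are illustrative, not the definitive OpenClaw schema):

```json
{
  "models": {
    "providers": {
      "portkey": {
        "baseUrl": "https://api.portkey.ai/v1",
        "apiKey": "YOUR_PORTKEY_API_KEY",
        "api": "openai-completions",
        "headers": { "x-portkey-config": "pc-config-id" },
        "models": [
          { "id": "portkey-dynamic", "name": "Portkey dynamic routing" }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": { "primary": "portkey/portkey-dynamic" }
    }
  }
}
```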
Here, portkey-dynamic is a placeholder model ID that does not exist in Portkey's Model Catalog. Instead, your Portkey Config (referenced by x-portkey-config) decides which real provider/model to call.
Once you've wired in the Config ID header, you can iterate on different config JSONs in the Portkey UI (failovers, retries, routing rules, etc.) without changing your OpenClaw setup.

Below are example config payloads you might use inside Portkey:
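For example, a fallback config with retries and simple caching might look like the following sketch — verify keys such as `strategy`, `targets`, `retry`, and `cache` against the current Portkey Configs reference before relying on them:

```json
{
  "strategy": { "mode": "fallback" },
  "targets": [
    { "override_params": { "model": "@mistral/open-mixtral-8x7b" } },
    { "override_params": { "model": "@gemini/gemini-2.5-flash" } }
  ],
  "retry": { "attempts": 3 },
  "cache": { "mode": "simple" }
}
```

Guardrails can be layered onto the same config from the Portkey dashboard once you've created them there.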
When deploying to a team, attach configs to API keys so developers get reliability and cost controls automatically:

1. Create a config with fallbacks, caching, retries, and guardrails.
2. Create an API key and attach the config.
3. Distribute the key to developers.

Developers use a simple config — all routing and reliability logic is handled by the attached config. When you update the config, changes apply immediately.