OpenClaw Reference (Mirrored)

Mirrored from OpenClaw (MIT)
This mirror is provided for convenience. OpenClawdBots is not affiliated with or endorsed by OpenClaw.

OpenRouter

OpenRouter provides a unified API that routes requests to many models behind a single endpoint and API key. It is OpenAI-compatible, so most OpenAI SDKs work by switching the base URL.
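Because the endpoint is OpenAI-compatible, pointing any OpenAI-style client at OpenRouter is mostly a base-URL change. A minimal sketch of the request shape (illustration only; nothing is sent over the network here, and the key and prompt are placeholders):

```python
import json

# OpenRouter's OpenAI-compatible endpoint.
BASE_URL = "https://openrouter.ai/api/v1"

def build_chat_request(api_key: str, model: str, prompt: str) -> tuple[str, dict, bytes]:
    """Assemble the URL, headers, and JSON body for a chat-completions call."""
    url = f"{BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,  # "openrouter/auto" lets OpenRouter pick the model
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, headers, body

url, headers, body = build_chat_request("sk-or-...", "openrouter/auto", "Hello")
```

Any OpenAI SDK that accepts a custom base URL can be used the same way: keep the request shape, swap the endpoint, and pass the OpenRouter key.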

Getting started

  1. Get your API key

    Create an API key at openrouter.ai/keys.

  2. Run onboarding
    openclaw onboard --auth-choice openrouter-api-key
    
  3. (Optional) Switch to a specific model

    Onboarding defaults to openrouter/auto. Pick a concrete model later:

    openclaw models set openrouter/<provider>/<model>
    

Config example

{
  env: { OPENROUTER_API_KEY: "sk-or-..." },
  agents: {
    defaults: {
      model: { primary: "openrouter/auto" },
    },
  },
}

Model references

NOTE

Model refs follow the pattern openrouter/<provider>/<model>. For the full list of available providers and models, see /concepts/model-providers.
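As a quick illustration of that ref shape, a ref splits into three slash-separated parts (the helper below is hypothetical and for illustration only; it is not part of the OpenClaw CLI):

```python
def split_model_ref(ref: str) -> tuple[str, str, str]:
    """Split an 'openrouter/<provider>/<model>' ref into its parts.

    Hypothetical helper for illustration; OpenClaw's real parsing may differ.
    """
    gateway, provider, model = ref.split("/", 2)
    if gateway != "openrouter":
        raise ValueError(f"not an OpenRouter ref: {ref!r}")
    return gateway, provider, model

parts = split_model_ref("openrouter/anthropic/claude-3.5-sonnet")
# → ("openrouter", "anthropic", "claude-3.5-sonnet")
```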

Authentication and headers

Under the hood, OpenClaw authenticates to OpenRouter by sending your API key as a Bearer token in the Authorization header.

On real OpenRouter requests (https://openrouter.ai/api/v1), OpenClaw also adds OpenRouter's documented app-attribution headers:

Header                     Value
HTTP-Referer               https://openclaw.ai
X-OpenRouter-Title         OpenClaw
X-OpenRouter-Categories    cli-agent

WARNING

If you repoint the OpenRouter provider at some other proxy or base URL, OpenClaw does not inject those OpenRouter-specific headers or Anthropic cache markers.
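The conditional described above can be sketched as follows (hypothetical helper; the real injection logic lives inside OpenClaw):

```python
OPENROUTER_BASE = "https://openrouter.ai/api/v1"

# App-attribution headers from the table above.
ATTRIBUTION_HEADERS = {
    "HTTP-Referer": "https://openclaw.ai",
    "X-OpenRouter-Title": "OpenClaw",
    "X-OpenRouter-Categories": "cli-agent",
}

def request_headers(api_key: str, base_url: str) -> dict:
    """Auth header always; attribution only on the real OpenRouter endpoint."""
    headers = {"Authorization": f"Bearer {api_key}"}
    if base_url.rstrip("/") == OPENROUTER_BASE:
        headers.update(ATTRIBUTION_HEADERS)  # skipped for custom proxies
    return headers
```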

Advanced notes

Anthropic cache markers

On verified OpenRouter routes, Anthropic model refs retain the Anthropic cache_control markers that OpenClaw injects, which improves prompt-cache reuse on system/developer prompt blocks.
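For reference, a cache-annotated system block looks roughly like this. The shape follows Anthropic's documented cache_control field with the "ephemeral" cache type; exactly where OpenClaw places the markers may differ:

```python
# A system prompt block annotated for Anthropic prompt caching, as it
# would travel through OpenRouter to an Anthropic model.
system_block = {
    "role": "system",
    "content": [
        {
            "type": "text",
            "text": "You are a coding agent...",
            "cache_control": {"type": "ephemeral"},
        }
    ],
}
```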

Thinking / reasoning injection

On supported non-auto routes, OpenClaw maps the selected thinking level to OpenRouter proxy reasoning payloads. Unsupported model hints and openrouter/auto skip that reasoning injection.
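OpenRouter's documented reasoning parameter is a top-level "reasoning" object (e.g. {"effort": "high"}). A hypothetical mapping from a thinking level to that payload, including the skips described above, might look like:

```python
# Hypothetical mapping; OpenClaw's actual level names and payloads may differ.
def reasoning_payload(model: str, thinking_level: str) -> dict:
    if model == "openrouter/auto":
        return {}  # auto routing skips reasoning injection
    efforts = {"low", "medium", "high"}
    if thinking_level not in efforts:
        return {}  # unsupported model hints also skip injection
    return {"reasoning": {"effort": thinking_level}}
```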

OpenAI-only request shaping

OpenRouter still runs through the proxy-style OpenAI-compatible path, so native OpenAI-only request shaping such as serviceTier, Responses store, OpenAI reasoning-compat payloads, and prompt-cache hints is not forwarded.

Gemini-backed routes

Gemini-backed OpenRouter refs stay on the proxy-Gemini path: OpenClaw keeps Gemini thought-signature sanitation there, but does not enable native Gemini replay validation or bootstrap rewrites.

Provider routing metadata

If you pass OpenRouter provider routing under model params, OpenClaw forwards it as OpenRouter routing metadata before the shared stream wrappers run.
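A request carrying OpenRouter provider routing might look like the sketch below. The "provider", "order", and "allow_fallbacks" fields come from OpenRouter's provider routing documentation; how OpenClaw surfaces them under model params is not shown here and may differ:

```python
# Illustrative request body with OpenRouter provider routing attached.
request_body = {
    "model": "anthropic/claude-3.5-sonnet",
    "messages": [{"role": "user", "content": "hi"}],
    "provider": {
        "order": ["anthropic", "amazon-bedrock"],  # preferred upstream order
        "allow_fallbacks": False,                  # fail rather than reroute
    },
}
```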