OpenClaw Reference (Mirrored)


Mirrored from OpenClaw (MIT)
This mirror is provided for convenience. OpenClawdBots is not affiliated with or endorsed by OpenClaw.

GitHub Copilot

GitHub Copilot is GitHub's AI coding assistant. It provides access to Copilot models for your GitHub account and plan. OpenClaw can use Copilot as a model provider in two different ways.

Two ways to use Copilot in OpenClaw

Built-in provider (github-copilot)

Use the native device-login flow to obtain a GitHub token, then exchange it for Copilot API tokens when OpenClaw runs. This is the default and simplest path because it does not require VS Code.

  1. Run the login command
    openclaw models auth login-github-copilot
    

    You will be prompted to visit a URL and enter a one-time code. Keep the terminal open until it completes.

  2. Set a default model
    openclaw models set github-copilot/gpt-4o
    

    Or in config:

    {
      agents: { defaults: { model: { primary: "github-copilot/gpt-4o" } } },
    }
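The login command above drives GitHub's standard OAuth device flow. As a rough sketch of the first step of that handshake (the client ID below is a placeholder, OpenClaw ships its own, and its actual implementation may differ):

```python
import json
import urllib.parse
import urllib.request

# Placeholder: OpenClaw uses its own OAuth app client ID.
CLIENT_ID = "<oauth-app-client-id>"

def start_device_login(client_id=CLIENT_ID,
                       url="https://github.com/login/device/code"):
    """Ask GitHub for a one-time user code to enter at the verification URL."""
    data = urllib.parse.urlencode({"client_id": client_id}).encode()
    req = urllib.request.Request(
        url, data=data, headers={"Accept": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # Payload includes user_code, verification_uri, and device_code;
        # the CLI then polls GitHub until the user approves the code.
        return json.load(resp)
```

This is why the terminal must stay open: the client keeps polling GitHub with the device code until you approve it in the browser.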
    
Copilot Proxy plugin (copilot-proxy)

Use the Copilot Proxy VS Code extension as a local bridge. OpenClaw talks to the proxy's /v1 endpoint and uses the model list you configure there.

NOTE

Choose this when you already run Copilot Proxy in VS Code or need to route through it. You must enable the plugin and keep the VS Code extension running.
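Because the proxy exposes an OpenAI-compatible /v1 endpoint, you can sanity-check that the bridge is up by listing its models. A minimal sketch, assuming the proxy listens on localhost (the port is a placeholder; use whatever the Copilot Proxy extension is configured with):

```python
import json
import urllib.request

# Port 3000 is an assumption; check your Copilot Proxy settings.
PROXY_BASE_URL = "http://localhost:3000/v1"

def list_proxy_models(base_url=PROXY_BASE_URL):
    """Return the model IDs advertised by the proxy's /v1/models endpoint."""
    with urllib.request.urlopen(f"{base_url}/models") as resp:
        payload = json.load(resp)
    # OpenAI-style responses wrap the model list in a "data" array.
    return [model["id"] for model in payload.get("data", [])]
```

If this returns an empty list or fails to connect, the VS Code extension is not running or the model list has not been configured.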

Optional flags

| Flag | Description |
| --- | --- |
| `--yes` | Skip the confirmation prompt |
| `--set-default` | Also apply the provider's recommended default model |
# Skip confirmation
openclaw models auth login-github-copilot --yes

# Log in and set the default model in one step
openclaw models auth login --provider github-copilot --method device --set-default

Interactive TTY required

The device-login flow requires an interactive TTY. Run it directly in a terminal, not in a non-interactive script or CI pipeline.
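A script that wraps the login can fail fast instead of hanging when no TTY is available. A small illustrative guard (not part of OpenClaw itself):

```python
import sys

def require_tty():
    """Fail fast when the device-login flow cannot prompt the user."""
    if not (sys.stdin.isatty() and sys.stdout.isatty()):
        raise SystemExit(
            "login-github-copilot needs an interactive terminal; "
            "run it directly, not from CI or a piped script"
        )
```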

Model availability depends on your plan

Copilot model availability depends on your GitHub plan. If a model is rejected, try another ID (for example github-copilot/gpt-4.1).

Transport selection

Claude model IDs use the Anthropic Messages transport automatically. GPT, o-series, and Gemini models keep the OpenAI Responses transport. OpenClaw selects the correct transport based on the model ref.
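The selection rule above amounts to a check on the model portion of the ref. A minimal sketch of that logic (the transport names here are illustrative labels, not OpenClaw's internal identifiers):

```python
def select_transport(model_ref: str) -> str:
    """Map a model ref such as 'github-copilot/gpt-4o' to a transport.

    Claude IDs go to the Anthropic Messages transport; GPT, o-series,
    and Gemini IDs stay on the OpenAI Responses transport.
    """
    # Drop the provider prefix, keeping only the model ID.
    model_id = model_ref.split("/", 1)[-1].lower()
    return "anthropic-messages" if model_id.startswith("claude") else "openai-responses"
```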

Environment variable resolution order

OpenClaw resolves Copilot auth from environment variables in the following priority order:

| Priority | Variable | Notes |
| --- | --- | --- |
| 1 | `COPILOT_GITHUB_TOKEN` | Highest priority, Copilot-specific |
| 2 | `GH_TOKEN` | GitHub CLI token (fallback) |
| 3 | `GITHUB_TOKEN` | Standard GitHub token (lowest priority) |

When multiple variables are set, OpenClaw uses the highest-priority one. The device-login flow (openclaw models auth login-github-copilot) stores its token in the auth profile store and takes precedence over all environment variables.
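The fallback chain can be sketched as a simple first-match lookup (this covers only the environment-variable part; a token stored by the device-login flow would win over all of these):

```python
import os

# Highest priority first, matching the table above.
COPILOT_TOKEN_VARS = ("COPILOT_GITHUB_TOKEN", "GH_TOKEN", "GITHUB_TOKEN")

def resolve_copilot_token(env=None):
    """Return the first non-empty token in priority order, or None."""
    env = os.environ if env is None else env
    for name in COPILOT_TOKEN_VARS:
        if env.get(name):
            return env[name]
    return None
```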

Token storage

The login stores a GitHub token in the auth profile store and exchanges it for a Copilot API token when OpenClaw runs. You do not need to manage the token manually.
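The shape of that exchange looks roughly like the sketch below. The endpoint shown is the one other Copilot clients are known to use; OpenClaw's internals may differ, so treat this as an illustration of the pattern, not a supported API:

```python
import json
import urllib.request

def exchange_for_copilot_token(github_token,
                               url="https://api.github.com/copilot_internal/v2/token"):
    """Trade a stored GitHub token for a short-lived Copilot API token."""
    req = urllib.request.Request(
        url, headers={"Authorization": f"token {github_token}"}
    )
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    # The Copilot token expires; clients re-run the exchange as needed.
    return payload["token"], payload.get("expires_at")
```

Because the Copilot token is short-lived, OpenClaw can repeat the exchange transparently whenever it runs, which is why only the GitHub token needs to persist in the auth profile store.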

WARNING

Requires an interactive TTY. Run the login command directly in a terminal, not inside a headless script or CI job.