fix: rewrite Copilot driver with OAuth device flow authentication#1017

Open
dmbutko wants to merge 2 commits into RightNow-AI:main from dmbutko:fix/copilot-oauth-device-flow

Conversation

dmbutko commented Apr 9, 2026

Summary

Complete rewrite of the GitHub Copilot LLM driver to fix broken authentication and add proper OAuth device flow support.

Before: The driver expected a GITHUB_TOKEN env var, but no standard token type (PAT, fine-grained PAT, gh CLI token) works with the Copilot token exchange endpoint. Users couldn't use Copilot at all.

After: Zero-config authentication via OAuth device flow. User runs openfang config set-key github-copilot or selects Copilot during openfang init, authorizes in their browser, and it just works.

What changed

Driver (copilot.rs) — full rewrite

  • OAuth device flow using Copilot's client ID (Iv1.b507a08c87ecfe98)
  • Token chain: ghu_ access token → Copilot API token (tid=...), both cached automatically
  • Token persistence to ~/.openfang/.copilot-tokens.json (0600 permissions)
  • Proper Editor-Version / Editor-Plugin-Version / Copilot-Integration-Id headers
  • Dynamic base URL from token exchange response (endpoints.api) — works with Individual, Business, and Enterprise plans
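
Token persistence as described above can be sketched roughly as follows. This is an illustrative Unix-only sketch, not the actual copilot.rs code: the struct, field, and function names are assumptions, and the real file is JSON-serialized rather than hand-formatted. The key point it shows is creating the cache file with 0600 permissions from the start rather than chmod-ing after the fact.

```rust
// Hedged sketch of the token cache: persists the ghu_ access token and
// the short-lived Copilot API token with owner-only (0600) permissions.
use std::fs::{self, OpenOptions};
use std::io::Write;
use std::os::unix::fs::{OpenOptionsExt, PermissionsExt};
use std::path::Path;

/// Hypothetical cached-token pair: the long-lived ghu_ token from the
/// device flow plus the short-lived Copilot API token from the exchange.
pub struct CachedTokens {
    pub github_token: String,  // ghu_... (device-flow result)
    pub copilot_token: String, // tid=... (exchange result)
}

/// Write the cache file with 0600 permissions, matching the behaviour
/// described for ~/.openfang/.copilot-tokens.json.
pub fn save_tokens(path: &Path, tokens: &CachedTokens) -> std::io::Result<()> {
    let body = format!(
        "{{\"github_token\":\"{}\",\"copilot_token\":\"{}\"}}",
        tokens.github_token, tokens.copilot_token
    );
    let mut file = OpenOptions::new()
        .write(true)
        .create(true)
        .truncate(true)
        .mode(0o600) // owner-only from the moment the file exists
        .open(path)?;
    file.write_all(body.as_bytes())
}
```

Setting the mode at open time avoids a window where the file briefly exists with default (umask-derived) permissions.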

Dynamic model catalog

  • Models fetched from {endpoints.api}/models on daemon startup
  • Refreshed on model_not_supported error (retry once with updated list)
  • Removed hardcoded copilot/gpt-4o and copilot/gpt-4 static entries
  • Uses merge_discovered_models() to populate catalog at runtime
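
The "refresh and retry once" behaviour can be sketched as below. The error and function names are illustrative (not the real driver's types), and the actual HTTP call to {endpoints.api}/models is abstracted behind a closure; what the sketch pins down is that the retry happens at most once, so a genuinely unsupported model cannot loop.

```rust
// Hedged sketch: on model_not_supported, refresh the catalog once and
// retry only if the requested model now appears in the live list.
#[derive(Debug, PartialEq)]
pub enum CompletionError {
    ModelNotSupported,
    Other(String),
}

/// Run `attempt`; on ModelNotSupported, call `refresh_models` once and
/// retry. A second failure is returned as-is, so there is no retry loop.
pub fn complete_with_refresh<A, R>(
    model: &str,
    mut attempt: A,
    mut refresh_models: R,
) -> Result<String, CompletionError>
where
    A: FnMut(&str) -> Result<String, CompletionError>,
    R: FnMut() -> Vec<String>,
{
    match attempt(model) {
        Err(CompletionError::ModelNotSupported) => {
            let catalog = refresh_models(); // re-fetch the live model list
            if catalog.iter().any(|m| m == model) {
                attempt(model) // model present after refresh: retry once
            } else {
                Err(CompletionError::ModelNotSupported)
            }
        }
        other => other,
    }
}
```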

Init wizard (init_wizard.rs)

  • New CopilotAuth TUI step between provider selection and model picker
  • Shows device code + verification URL with spinner
  • [Enter] to open browser (doesn't auto-open)
  • After auth succeeds, fetches live model list and auto-advances to model picker
  • User sees all available models (43 in Enterprise) with real-time data
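
The step ordering described above can be sketched as a small state machine. The variant names are assumptions (the real steps in init_wizard.rs may differ); the sketch captures that CopilotAuth only appears when the Copilot provider was selected, and auto-advances to the model picker once auth succeeds.

```rust
// Hedged sketch of the wizard-step ordering: CopilotAuth sits between
// provider selection and the model picker.
#[derive(Debug, Clone, Copy, PartialEq)]
pub enum WizardStep {
    ProviderSelect,
    CopilotAuth, // device code + verification URL, spinner while polling
    ModelPicker, // populated from the live model fetch after auth
    Done,
}

/// Advance the wizard. CopilotAuth is entered only when the user picked
/// the Copilot provider, and leads into the model picker on success.
pub fn next_step(current: WizardStep, picked_copilot: bool) -> WizardStep {
    match current {
        WizardStep::ProviderSelect if picked_copilot => WizardStep::CopilotAuth,
        WizardStep::ProviderSelect => WizardStep::ModelPicker,
        WizardStep::CopilotAuth => WizardStep::ModelPicker,
        WizardStep::ModelPicker | WizardStep::Done => WizardStep::Done,
    }
}
```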

CLI integration

  • openfang config set-key github-copilot — runs interactive device flow
  • openfang doctor — checks for Copilot auth token file, shows ✔ if authenticated
  • Simplified driver instantiation in mod.rs — no env vars needed
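
The doctor check can be sketched as a simple file-existence probe. Function names are illustrative; the path matches the one named earlier in this PR.

```rust
// Hedged sketch of the `openfang doctor` check: Copilot counts as
// authenticated when the persisted token file exists.
use std::path::{Path, PathBuf};

/// Hypothetical helper: where the driver persists its tokens.
pub fn copilot_token_path(home: &Path) -> PathBuf {
    home.join(".openfang").join(".copilot-tokens.json")
}

/// Doctor-style check: true (shown as a check mark) when the token
/// file is present on disk.
pub fn copilot_auth_ok(home: &Path) -> bool {
    copilot_token_path(home).is_file()
}
```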

Model catalog cleanup

  • Removed 7 static Copilot model entries (replaced by dynamic fetch)
  • Updated aliases (copilotgpt-4o)
  • Copilot auth detection via token file instead of env var

Testing

Tested end-to-end on Windows (x64 cross-compiled) with GitHub Copilot Enterprise:

Test Result
Device flow authentication
Token exchange (copilot_internal/v2/token) ✔ 200 OK
Live model fetch ✔ 43 models
Chat completion (claude-opus-4.6-1m)
openfang init full wizard flow
openfang config set-key github-copilot
openfang doctor ✔ "All checks passed"
openfang start (daemon boot with model fetch)
/model switcher in chat TUI ✔ shows live models

Why not use the Copilot SDK?

The official @github/copilot-sdk is TypeScript/Node and spawns the Copilot CLI as a subprocess. OpenFang is Rust and only needs Copilot as a completion endpoint. Direct API integration is simpler and has no Node.js dependency.

Why this client ID?

Iv1.b507a08c87ecfe98 is the VS Code Copilot extension's OAuth App client ID. The copilot_internal/v2/token endpoint only accepts ghu_ tokens from this specific app — custom OAuth Apps produce gho_ tokens which are rejected with 404. This is the same approach used by shell-ask, aider, continue.dev, and every other non-SDK Copilot integration.
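
Because the exchange endpoint only accepts ghu_ tokens, a cheap prefix check before calling it gives a clearer error than the endpoint's bare 404. The helper below is an illustrative sketch, not part of this PR's diff; the prefixes themselves are GitHub's documented token-type prefixes.

```rust
// Hedged sketch: classify a GitHub token by its prefix to explain
// up front why the Copilot token exchange would reject it.
pub fn token_kind(token: &str) -> &'static str {
    if token.starts_with("ghu_") {
        "user-to-server token (accepted by copilot_internal/v2/token)"
    } else if token.starts_with("gho_") {
        "OAuth App token (rejected with 404)"
    } else if token.starts_with("ghp_") {
        "classic PAT (rejected)"
    } else {
        "unknown or unsupported token type"
    }
}
```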

Closes #1014

Dmitry Butko and others added 2 commits April 9, 2026 13:01
The Copilot LLM driver was broken - it expected users to provide a
GITHUB_TOKEN env var, but no standard token type (PAT, gh CLI token)
works with the Copilot token exchange endpoint.

Changes:
- Full rewrite of copilot.rs with OAuth device flow using Copilot's
  client ID (Iv1.b507a08c87ecfe98)
- Three-layer token chain: ghu_ (8h) -> Copilot API token (30min),
  with automatic caching and refresh
- Dynamic model fetching from Copilot API on daemon startup and on
  model_not_supported error
- Init wizard: TUI auth screen with device code display, live model
  picker after authentication
- set-key command: interactive device flow for github-copilot provider
- Doctor: detects Copilot auth via persisted token file
- Removed static Copilot model entries (now fetched dynamically)
- Simplified driver instantiation (no env vars needed)

Tested end-to-end with Copilot Enterprise: auth, token exchange,
43 models fetched, completions working with claude-opus-4.6-1m.

Closes RightNow-AI#1014

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
…pilot proxy

The Copilot API proxy rejects conversations ending with an empty
assistant message as unsupported 'assistant message prefill' when
proxying Claude and Gemini models. GPT models are unaffected.

Strips trailing empty assistant messages (no content, no tool calls)
before sending the request. Applied in both complete() and stream()
paths.

Also reverts unused fixup_request method from copilot.rs since the
fix belongs in the OpenAI driver layer.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
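
The trailing-message fix in that second commit can be sketched with a simplified message type (the real type in the OpenAI driver layer is richer; field names here are assumptions):

```rust
// Hedged sketch: drop trailing assistant messages with neither content
// nor tool calls before sending the request, since the Copilot proxy
// rejects them as "assistant message prefill" for Claude/Gemini models.
#[derive(Debug, Clone, PartialEq)]
pub struct Message {
    pub role: String,
    pub content: String,
    pub tool_calls: Vec<String>, // simplified stand-in for the real type
}

/// Strip trailing empty assistant messages. Loops so that several
/// empty trailing messages in a row are all removed.
pub fn strip_trailing_empty_assistant(messages: &mut Vec<Message>) {
    while let Some(last) = messages.last() {
        let empty = last.role == "assistant"
            && last.content.is_empty()
            && last.tool_calls.is_empty();
        if empty {
            messages.pop();
        } else {
            break;
        }
    }
}
```

Applying this in both the complete() and stream() paths keeps GPT models unaffected, since their conversations are never modified unless the trailing message is genuinely empty.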


Development

Successfully merging this pull request may close these issues.

Copilot driver auth is broken - requires OAuth device flow token, not PAT

1 participant