The Copilot provider connects Clanka to the GitHub Copilot API (https://api.githubcopilot.com) using GitHub's OAuth device flow. No OpenAI account or API key is needed — authentication is tied to your GitHub account and any active Copilot subscription.
Authentication
`GithubCopilotAuth` implements the GitHub OAuth device authorization flow:
- On first use, Clanka requests a device code from GitHub and calls your `DeviceCodeHandler` with the verification URL (https://github.com/login/device) and the user code.
- The user opens the URL in a browser, enters the code, and approves access.
- Clanka polls GitHub until authorization is granted, then stores the access token locally.
- The token is persisted in `KeyValueStore` under the prefix `github-copilot.auth/`. On the next run, the stored token is loaded from disk. The default storage location (via `KeyValueStore.layerFileSystem`) is `~/.config/clanka`.
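The polling step above follows the standard OAuth device-flow rules (RFC 8628). A minimal sketch of that logic, with helper names that are illustrative rather than Clanka's actual API:

```typescript
// Hypothetical helpers sketching the device-flow polling rules and the
// token storage key scheme described above; not Clanka's real internals.

type PollError = "authorization_pending" | "slow_down" | "expired_token";

// GitHub tells the client how often to poll via an `interval` (seconds).
// Per RFC 8628: "authorization_pending" means keep polling at the same
// interval; "slow_down" means add 5 seconds to the interval;
// "expired_token" means the device code is dead and the flow restarts.
export function nextPollDelay(
  currentSeconds: number,
  error: PollError
): number | null {
  switch (error) {
    case "authorization_pending":
      return currentSeconds;
    case "slow_down":
      return currentSeconds + 5;
    case "expired_token":
      return null; // re-run the device flow from the start
  }
}

// Keys under which token data would live in the KeyValueStore. The
// `github-copilot.auth/` prefix comes from the docs; the suffix passed
// in by callers is a placeholder.
export function storageKey(name: string): string {
  return `github-copilot.auth/${name}`;
}
```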
Unlike the Codex provider's credentials, GitHub Copilot access tokens have no refresh token. If the stored token expires, Clanka will automatically re-run the device flow.
Displaying the verification prompt
`DeviceCodeHandler.layerConsole` is a built-in layer that prints the verification URL and user code to stdout. Replace `layerConsole` with a custom `DeviceCodeHandler` layer if you want to display the prompt differently — for example, opening the browser automatically or showing a UI dialog.
The `DeviceCodeHandler` service interface is:
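A sketch of what such a handler might look like, with field names inferred from the surrounding prose (shown with plain functions rather than Effect wrappers for brevity; Clanka's actual definition may differ):

```typescript
// Hypothetical shape of the DeviceCodeHandler service; the names below
// are assumptions based on this page's prose, not Clanka's real types.
export interface DeviceCodeInfo {
  readonly verificationUri: string; // e.g. https://github.com/login/device
  readonly userCode: string;        // the code the user enters in the browser
}

export interface DeviceCodeHandler {
  // Called once per device-flow run to surface the prompt to the user.
  readonly onDeviceCode: (info: DeviceCodeInfo) => Promise<void>;
}

// Builds the one-line prompt a console handler would print.
export function formatPrompt(info: DeviceCodeInfo): string {
  return `Open ${info.verificationUri} and enter code ${info.userCode}`;
}

// A handler in the spirit of DeviceCodeHandler.layerConsole: it just
// prints the URL and code to stdout.
export const consoleHandler: DeviceCodeHandler = {
  onDeviceCode: async (info) => {
    console.log(formatPrompt(info));
  },
};
```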
Setting up the client layer
`Copilot.layerClient` wires together the `OpenAiClient` (from `@effect/ai-openai-compat`) and `GithubCopilotAuth`. Provide it with HTTP client and file system services, plus a `DeviceCodeHandler`:
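A sketch of that wiring, assuming the standard Effect platform layers for Node; the `clanka` import path and exact layer names are assumptions:

```typescript
// Illustrative wiring only — module paths and layer names may differ
// from Clanka's actual exports.
import { Layer } from "effect"
import { NodeFileSystem, NodeHttpClient } from "@effect/platform-node"
import { Copilot, DeviceCodeHandler } from "clanka" // hypothetical entry point

const CopilotClient = Copilot.layerClient.pipe(
  Layer.provide(DeviceCodeHandler.layerConsole), // prints the device prompt
  Layer.provide(NodeHttpClient.layerUndici),     // HTTP client for API calls
  Layer.provide(NodeFileSystem.layer)            // file system for token storage
)
```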
The Copilot provider uses `@effect/ai-openai-compat` for API compatibility with the GitHub Copilot endpoint. This package provides the same `OpenAiClient` and `OpenAiLanguageModel` interfaces as `@effect/ai-openai`, so the programming model is identical.
Model factory
`Copilot.model` creates a `Model.Model` value from a model name string and optional options:
Supported model names include `claude-opus-4.6`, `claude-sonnet-4.5`, `gpt-4o`, and others, depending on your Copilot plan.
Model options
The `options` object accepts any field from `OpenAiLanguageModel.Config["Service"]`. Provider-specific parameters (such as `thinking` for Claude models) can be included and are forwarded as-is to the API.
| Field | Description |
|---|---|
| `thinking.thinking_budget` | For Claude models: sets the extended thinking token budget. |
| Other OpenAI-compatible fields | Forwarded directly to the GitHub Copilot API. |
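For example, a Claude model with an extended thinking budget might be created like this; the `thinking` shape follows the table above and is forwarded as-is, so treat the exact field names as an assumption:

```typescript
import { Copilot } from "clanka" // hypothetical entry point

// Illustrative only: provider-specific `thinking` options alongside a
// standard OpenAI-compatible field.
const model = Copilot.model("claude-sonnet-4.5", {
  thinking: { thinking_budget: 8192 }, // extended thinking token budget
  temperature: 1,                      // forwarded to the Copilot API
})
```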
Creating a model layer
`Layer.provideMerge` is used here (instead of `Layer.provide`) so that `ModelServices` remains available in the output layer — useful when you want to reuse the same `ModelServices` for other models or services in the same program.
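The difference can be seen at the type level with two placeholder layers (not Clanka-specific):

```typescript
// Type-level illustration of Layer.provide vs Layer.provideMerge.
import { Layer } from "effect"

declare const ModelServices: Layer.Layer<"ModelServices">
declare const CopilotModel: Layer.Layer<"Model", never, "ModelServices">

// Layer.provide: ModelServices satisfies CopilotModel's requirement but
// is hidden from the output.
const hidden = Layer.provide(CopilotModel, ModelServices)
// inferred as Layer.Layer<"Model">

// Layer.provideMerge: ModelServices stays in the output, so other
// layers built later can reuse the same instance.
const merged = Layer.provideMerge(CopilotModel, ModelServices)
// inferred as Layer.Layer<"Model" | "ModelServices">
```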
Complete example
The following shows the Copilot model setup from `examples/cli.ts`, alongside a Codex primary model, demonstrating that both providers can be used in the same program:
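A sketch in the spirit of that example, assuming the Codex provider exposes a parallel `model` factory; import paths and model names are placeholders:

```typescript
// Illustrative only — not the actual examples/cli.ts source.
import { Layer } from "effect"
import { NodeFileSystem, NodeHttpClient } from "@effect/platform-node"
import { Codex, Copilot, DeviceCodeHandler } from "clanka" // hypothetical

// Shared platform services for both providers.
const Services = Layer.mergeAll(
  NodeHttpClient.layerUndici,
  NodeFileSystem.layer,
  DeviceCodeHandler.layerConsole
)

// Primary model from Codex, secondary from Copilot, in one program.
const primary = Codex.model("gpt-5-codex")           // placeholder name
const secondary = Copilot.model("claude-sonnet-4.5") // per the list above
```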