This guide walks you through building a minimal but fully functional Clanka agent. You will configure a provider, compose Effect layers, send a prompt, and stream formatted output to your terminal — all based on the patterns in the official examples/cli.ts.
Clanka requires Effect and several @effect/* peer packages. See the installation guide for exact version ranges and tsconfig requirements before continuing.
1
Install Clanka and peer dependencies
Install clanka and its required peer packages. The example uses pnpm, but npm and yarn work the same way.
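As a sketch, the install command might look like the following. The exact package list is an assumption based on the imports used later in this guide (`clanka`, `effect`, `@effect/platform-node`) — check the installation guide for the authoritative list and version ranges:

```shell
# Using pnpm; with npm or yarn, swap in `npm install` or `yarn add`
pnpm add clanka effect @effect/platform-node
```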
2
Configure the Codex provider
The Codex provider authenticates through the ChatGPT web API. Build the shared model services layer by composing Codex.layerClient with the Node.js platform services and a persistent key-value store for credentials.
```typescript
import { Layer } from "effect"
import { Codex } from "clanka"
import {
  NodeHttpClient,
  NodeServices,
  NodeSocket,
} from "@effect/platform-node"
import { KeyValueStore } from "effect/unstable/persistence"
import * as NodePath from "node:path"

// Credentials are persisted under $XDG_CONFIG_HOME/clanka
const XDG_CONFIG_HOME =
  process.env.XDG_CONFIG_HOME ||
  NodePath.join(process.env.HOME || "", ".config")

const ModelServices = Codex.layerClient.pipe(
  Layer.provide(
    KeyValueStore.layerFileSystem(NodePath.join(XDG_CONFIG_HOME, "clanka")),
  ),
  Layer.provideMerge(NodeServices.layer),
  Layer.provideMerge(NodeHttpClient.layerUndici),
  Layer.merge(NodeSocket.layerWebSocketConstructorWS),
)
```
3
Choose a model
Use Codex.modelWebSocket to create a model layer. The model layer provides the LanguageModel service that the agent consumes. Pass ModelServices to satisfy its dependencies.
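A minimal sketch of this step, assuming `Codex.modelWebSocket` accepts a model id string and returns a layer providing LanguageModel (the `Gpt54` and `SubAgentModel` names match those used in the final program below; the model id strings are placeholders — substitute the ids your Codex account actually exposes):

```typescript
import { Layer } from "effect"
import { Codex } from "clanka"

// Main model for the agent. The id string is a placeholder assumption.
const Gpt54 = Codex.modelWebSocket("gpt-5.4").pipe(
  // ModelServices is the shared services layer built in the previous step.
  Layer.provide(ModelServices),
)

// A second, typically cheaper model for subagents, wired the same way.
const SubAgentModel = Codex.modelWebSocket("gpt-5.4-mini").pipe(
  Layer.provide(ModelServices),
)
```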
4
Create the agent layer
Agent.layerLocal wires together the Agent service, the AgentExecutor that manages the sandbox, and the Node.js filesystem and process services. Point it at the directory the agent should work in — typically process.cwd().
```typescript
import { Agent } from "clanka"
import { NodeHttpClient, NodeServices } from "@effect/platform-node"
import { Layer } from "effect"

const AgentLayer = Agent.layerLocal({
  directory: process.cwd(),
}).pipe(
  Layer.provide(NodeServices.layer),
  Layer.provide(NodeHttpClient.layerUndici),
)
```
5
Send a prompt and stream output
Use Effect.gen to access the Agent service, call agent.send() with your prompt, and pipe the resulting stream through OutputFormatter.pretty() to get human-readable terminal output.
```typescript
import { Effect, Stream } from "effect"
import { Agent, OutputFormatter } from "clanka"
import { NodeRuntime } from "@effect/platform-node"

Effect.gen(function* () {
  const agent = yield* Agent.Agent
  const output = yield* agent.send({
    prompt: process.argv.slice(2).join(" "),
  })
  yield* output.pipe(
    OutputFormatter.pretty(),
    Stream.runForEachArray((chunk) => {
      for (const out of chunk) {
        process.stdout.write(out)
      }
      return Effect.void
    }),
  )
}).pipe(
  Effect.scoped,
  Effect.provide([AgentLayer, Gpt54, Agent.layerSubagentModel(SubAgentModel)]),
  NodeRuntime.runMain,
)
```
6
Run your agent
Compile the file with tsc or run it directly with a TypeScript runner, passing your task as command-line arguments.
```shell
# Pass a prompt directly (non-interactive)
npx tsx agent.ts "Refactor the fetchUser function to use async/await"
```
As it works through your request, the agent prints its reasoning steps, the code it generates, output from the sandbox, and a final task-complete summary.