## Overview

In this quickstart, you'll create a fully functional AI agent that can:

- Respond to user queries
- Use tools to perform calculations
- Handle multi-turn conversations

By the end, you'll understand the basics of creating agents, defining tools, and connecting to LLM providers.
## Prerequisites

- JDK 17 or higher
- An API key from at least one LLM provider (OpenAI, Anthropic, Google, etc.)
- Basic knowledge of Kotlin and coroutines
## Step 1: Set Up Your API Key

First, you'll need an API key from an LLM provider. We'll use OpenAI in this example, but Koog supports multiple providers.

**Get an API key.** Sign up at OpenAI and generate an API key from your dashboard.

**Set the environment variable.** Set your API key as an environment variable:

```shell
export OPENAI_API_KEY="your-api-key-here"
```

Or add it to your IDE's run configuration.

> **Warning:** Never hardcode API keys in your source code. Always use environment variables or secure configuration management.
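Before running any of the examples below, it can save a debugging round-trip to confirm the variable is actually visible to new processes. A minimal shell check (not part of Koog, just standard POSIX shell):

```shell
# Confirm the key is exported before launching the agent
if [ -z "$OPENAI_API_KEY" ]; then
  echo "OPENAI_API_KEY is not set"
else
  echo "OPENAI_API_KEY is set"
fi
```

Remember that variables exported in one terminal session are not visible in another, nor in an IDE launched separately from that shell.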
## Step 2: Create a Simple Agent

Let's create your first agent: a helpful assistant that can answer questions.
Create a new file called `Main.kt`:

```kotlin
import ai.koog.agents.core.agent.AIAgent
import ai.koog.prompt.executor.llms.all.simpleOpenAIExecutor
import ai.koog.prompt.executor.clients.openai.OpenAIModels

suspend fun main() {
    // Get the API key from the environment
    val apiKey = System.getenv("OPENAI_API_KEY")
        ?: error("Please set OPENAI_API_KEY environment variable")

    // Create a prompt executor (handles LLM communication)
    simpleOpenAIExecutor(apiKey).use { executor ->
        // Create the agent
        val agent = AIAgent(
            promptExecutor = executor,
            systemPrompt = "You are a helpful assistant. Answer questions concisely.",
            llmModel = OpenAIModels.Chat.GPT4oMini
        )

        // Run the agent with a user query
        val result = agent.run("What is Kotlin?")
        println("Agent: $result")
    }
}
```
Run this code, and you should see a response from the AI agent explaining what Kotlin is!
## Step 3: Add Tools

Let's make the agent more powerful by giving it tools. We'll create a calculator agent that can perform mathematical operations.
Create a new file `CalculatorTools.kt`:

```kotlin
import ai.koog.agents.core.tools.annotations.LLMDescription
import ai.koog.agents.core.tools.annotations.Tool
import ai.koog.agents.core.tools.reflect.ToolSet

@LLMDescription("Tools for basic calculator operations")
class CalculatorTools : ToolSet {

    @Tool
    @LLMDescription("Adds two numbers")
    fun add(
        @LLMDescription("First number") a: Double,
        @LLMDescription("Second number") b: Double
    ): String {
        return (a + b).toString()
    }

    @Tool
    @LLMDescription("Subtracts second number from first")
    fun subtract(
        @LLMDescription("First number") a: Double,
        @LLMDescription("Second number") b: Double
    ): String {
        return (a - b).toString()
    }

    @Tool
    @LLMDescription("Multiplies two numbers")
    fun multiply(
        @LLMDescription("First number") a: Double,
        @LLMDescription("Second number") b: Double
    ): String {
        return (a * b).toString()
    }

    @Tool
    @LLMDescription("Divides first number by second")
    fun divide(
        @LLMDescription("First number") a: Double,
        @LLMDescription("Second number") b: Double
    ): String {
        if (b == 0.0) return "Error: Division by zero"
        return (a / b).toString()
    }
}
```
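Stripped of the Koog annotations, the tool methods are plain Kotlin functions, which makes them easy to sanity-check in isolation before wiring them into an agent. A framework-free sketch of the same logic (the `CalculatorLogic` name is illustrative only):

```kotlin
// Plain-Kotlin version of the calculator tool logic above, without Koog
// annotations, so the behavior can be verified independently of any LLM.
class CalculatorLogic {
    fun add(a: Double, b: Double): String = (a + b).toString()
    fun subtract(a: Double, b: Double): String = (a - b).toString()
    fun multiply(a: Double, b: Double): String = (a * b).toString()
    fun divide(a: Double, b: Double): String =
        if (b == 0.0) "Error: Division by zero" else (a / b).toString()
}

fun main() {
    val calc = CalculatorLogic()
    println(calc.add(25.0, 4.0))     // prints "29.0"
    println(calc.divide(10.0, 0.0))  // prints "Error: Division by zero"
}
```

Note that the tools return `String` rather than `Double`: tool results are fed back to the LLM as text, and the division-by-zero case returns an error message the model can relay to the user instead of throwing.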
Update your `Main.kt`:

```kotlin
import ai.koog.agents.core.agent.AIAgent
import ai.koog.agents.core.tools.ToolRegistry
import ai.koog.agents.core.tools.reflect.asTools
import ai.koog.prompt.executor.llms.all.simpleOpenAIExecutor
import ai.koog.prompt.executor.clients.openai.OpenAIModels

suspend fun main() {
    val apiKey = System.getenv("OPENAI_API_KEY")
        ?: error("Please set OPENAI_API_KEY environment variable")

    // Create a tool registry with calculator tools
    val toolRegistry = ToolRegistry {
        tools(CalculatorTools().asTools())
    }

    simpleOpenAIExecutor(apiKey).use { executor ->
        val agent = AIAgent(
            promptExecutor = executor,
            systemPrompt = "You are a helpful calculator. Use the available tools to solve math problems.",
            llmModel = OpenAIModels.Chat.GPT4oMini,
            toolRegistry = toolRegistry
        )

        // Test the calculator agent
        val result = agent.run("What is 25 multiplied by 4, then add 10?")
        println("Agent: $result")
    }
}
```
Run the code. The agent will use the `multiply` and `add` tools to calculate the result: 110.
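The model decomposes the request into two sequential tool calls; the arithmetic the tools actually perform is just:

```kotlin
fun main() {
    val product = 25.0 * 4.0   // the multiply tool: 100.0
    val total = product + 10.0 // the add tool, fed the previous result: 110.0
    println(total)             // prints "110.0"
}
```

The interesting part is not the arithmetic but the orchestration: the LLM decides which tool to call, passes the first result into the second call, and then phrases the final answer.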
## Step 4: Add Event Handling

Let's observe what the agent is doing by adding event handlers:
```kotlin
import ai.koog.agents.core.agent.AIAgent
import ai.koog.agents.core.tools.ToolRegistry
import ai.koog.agents.core.tools.reflect.asTools
import ai.koog.agents.features.eventHandler.feature.handleEvents
import ai.koog.prompt.executor.llms.all.simpleOpenAIExecutor
import ai.koog.prompt.executor.clients.openai.OpenAIModels

suspend fun main() {
    val apiKey = System.getenv("OPENAI_API_KEY")
        ?: error("Please set OPENAI_API_KEY environment variable")

    val toolRegistry = ToolRegistry {
        tools(CalculatorTools().asTools())
    }

    simpleOpenAIExecutor(apiKey).use { executor ->
        val agent = AIAgent(
            promptExecutor = executor,
            systemPrompt = "You are a helpful calculator. Use the available tools to solve math problems.",
            llmModel = OpenAIModels.Chat.GPT4oMini,
            toolRegistry = toolRegistry
        ) {
            // Add event handlers to observe agent behavior
            handleEvents {
                onToolCallStarting { context ->
                    println("🔧 Using tool: ${context.toolName}")
                    println("   Arguments: ${context.toolArgs}")
                }
                onToolCallCompleted { context ->
                    println("✅ Tool result: ${context.result}")
                }
                onAgentCompleted { context ->
                    println("\n✨ Agent finished: ${context.result}")
                }
            }
        }

        println("=== Calculator Agent ===")
        agent.run("What is (15 + 25) multiplied by 2?")
    }
}
```
Now you'll see detailed logs showing:

- Which tools are being called
- What arguments are being passed
- What results are returned
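Under the hood this is an observer pattern: you register callbacks once, and the agent invokes them as it reaches each phase of a run. A framework-free sketch of the idea (the names mirror Koog's handlers for readability, but this is not Koog's API):

```kotlin
// Observer-pattern sketch: callbacks are registered up front and invoked
// by the runner as each phase of a simulated agent run occurs.
class EventHandlers {
    var onToolCallStarting: (toolName: String) -> Unit = {}
    var onToolCallCompleted: (result: String) -> Unit = {}
    var onCompleted: (result: String) -> Unit = {}
}

// Simulates one agent run: a tool call followed by a final answer.
fun runWithEvents(handlers: EventHandlers) {
    handlers.onToolCallStarting("multiply")
    handlers.onToolCallCompleted("100.0")
    handlers.onCompleted("The answer is 110")
}

fun main() {
    val handlers = EventHandlers().apply {
        onToolCallStarting = { println("🔧 Using tool: $it") }
        onToolCallCompleted = { println("✅ Tool result: $it") }
        onCompleted = { println("✨ Agent finished: $it") }
    }
    runWithEvents(handlers)
}
```

Because the handlers are side-effect hooks rather than part of the control flow, you can add or remove logging without touching the agent's logic.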
## Step 5: Try Different LLM Providers

Koog supports multiple LLM providers, including OpenAI, Anthropic, Google Gemini, Ollama (local), and OpenRouter. Here's the setup for OpenAI:
```kotlin
import ai.koog.prompt.executor.llms.all.simpleOpenAIExecutor
import ai.koog.prompt.executor.clients.openai.OpenAIModels

val apiKey = System.getenv("OPENAI_API_KEY")

simpleOpenAIExecutor(apiKey).use { executor ->
    val agent = AIAgent(
        promptExecutor = executor,
        llmModel = OpenAIModels.Chat.GPT4o, // or GPT4oMini, GPT35Turbo
        systemPrompt = "You are a helpful assistant.",
        toolRegistry = toolRegistry
    )
}
```
## Step 6: Build an Interactive Chat Agent

Let's create an interactive agent that can handle multi-turn conversations:
```kotlin
import ai.koog.agents.core.agent.AIAgent
import ai.koog.agents.core.tools.ToolRegistry
import ai.koog.agents.core.tools.reflect.asTools
import ai.koog.agents.features.eventHandler.feature.handleEvents
import ai.koog.prompt.executor.llms.all.simpleOpenAIExecutor
import ai.koog.prompt.executor.clients.openai.OpenAIModels

suspend fun main() {
    val apiKey = System.getenv("OPENAI_API_KEY")
        ?: error("Please set OPENAI_API_KEY environment variable")

    val toolRegistry = ToolRegistry {
        tools(CalculatorTools().asTools())
    }

    simpleOpenAIExecutor(apiKey).use { executor ->
        val agent = AIAgent(
            promptExecutor = executor,
            systemPrompt = """
                You are a helpful calculator assistant.
                Use the available tools to solve math problems.
                Be friendly and explain your reasoning.
            """.trimIndent(),
            llmModel = OpenAIModels.Chat.GPT4oMini,
            toolRegistry = toolRegistry
        ) {
            handleEvents {
                onToolCallStarting { context ->
                    println("  [Using ${context.toolName}...]")
                }
            }
        }

        println("=== Calculator Chat Agent ===")
        println("Type 'quit' to exit\n")

        while (true) {
            print("You: ")
            val input = readlnOrNull()?.trim() ?: break

            if (input.equals("quit", ignoreCase = true)) {
                println("Goodbye!")
                break
            }
            if (input.isEmpty()) continue

            print("Agent: ")
            val response = agent.run(input)
            println(response)
            println()
        }
    }
}
```
Now you have a fully interactive calculator agent that maintains conversation context!
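Multi-turn context works because each exchange is appended to a growing message history that accompanies every request to the model. A framework-free sketch of that mechanism (illustrative names, not Koog's internal representation):

```kotlin
// Minimal sketch of multi-turn context: every user/assistant exchange is
// appended to a history list, and each new request sees the full history.
data class Message(val role: String, val content: String)

class Conversation(systemPrompt: String) {
    private val history = mutableListOf(Message("system", systemPrompt))

    // `respond` stands in for the LLM call; it receives the whole history.
    fun userTurn(input: String, respond: (List<Message>) -> String): String {
        history += Message("user", input)
        val reply = respond(history)
        history += Message("assistant", reply)
        return reply
    }

    fun size() = history.size
}

fun main() {
    val chat = Conversation("You are a helpful calculator assistant.")
    chat.userTurn("What is 2 + 2?") { "4" }
    chat.userTurn("Multiply that by 3") { history ->
        // Earlier turns are visible here, which is what makes the
        // follow-up question ("that") resolvable.
        "12"
    }
    println(chat.size()) // prints "5": 1 system + 2 user + 2 assistant
}
```

This is also why long conversations eventually hit the model's context window: the history only grows unless it is summarized or truncated.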
## Complete Example

Here's a complete example that puts it all together:
```kotlin
import ai.koog.agents.core.agent.AIAgent
import ai.koog.agents.core.tools.ToolRegistry
import ai.koog.agents.core.tools.annotations.LLMDescription
import ai.koog.agents.core.tools.annotations.Tool
import ai.koog.agents.core.tools.reflect.ToolSet
import ai.koog.agents.core.tools.reflect.asTools
import ai.koog.agents.features.eventHandler.feature.handleEvents
import ai.koog.prompt.executor.llms.all.simpleOpenAIExecutor
import ai.koog.prompt.executor.clients.openai.OpenAIModels

@LLMDescription("Tools for basic math operations")
class MathTools : ToolSet {

    @Tool
    @LLMDescription("Adds two numbers")
    fun add(
        @LLMDescription("First number") a: Double,
        @LLMDescription("Second number") b: Double
    ) = (a + b).toString()

    @Tool
    @LLMDescription("Multiplies two numbers")
    fun multiply(
        @LLMDescription("First number") a: Double,
        @LLMDescription("Second number") b: Double
    ) = (a * b).toString()
}

suspend fun main() {
    val apiKey = System.getenv("OPENAI_API_KEY")
        ?: error("Set OPENAI_API_KEY environment variable")

    val tools = ToolRegistry { tools(MathTools().asTools()) }

    simpleOpenAIExecutor(apiKey).use { executor ->
        val agent = AIAgent(
            promptExecutor = executor,
            systemPrompt = "You are a helpful math assistant.",
            llmModel = OpenAIModels.Chat.GPT4oMini,
            toolRegistry = tools
        ) {
            handleEvents {
                onToolCallStarting { println("🔧 ${it.toolName}") }
                onAgentCompleted { println("✅ Done!") }
            }
        }

        val result = agent.run("Calculate (10 + 5) * 3")
        println("Result: $result")
    }
}
```
## What You've Learned

- **Agent creation:** how to create an AI agent with a system prompt and LLM model
- **Tool definition:** how to define and register tools using annotations
- **Event handling:** how to observe agent behavior with event handlers
- **LLM providers:** how to switch between different LLM providers
## Next Steps

- Explore Advanced Features
- More Examples

Explore more advanced patterns:

- **Streaming responses:** process LLM responses in real time
- **Multi-agent systems:** build agents that work together
- **Persistent agents:** save and restore agent state
- **RAG with memory:** add long-term memory with vector storage

## Get Help

Need assistance? We're here to help.