## Prerequisites

Before you begin, make sure you have:

- Go 1.22 or higher installed
- A Dedalus API key
- The Dedalus SDK installed (see Installation)
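If you have not installed the SDK yet, it can typically be fetched with `go get` from inside your module (module path assumed from the import used in the examples below; see the Installation page for the authoritative steps):

```shell
go get github.com/dedalus-labs/dedalus-sdk-go
```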
## Your First Chat Completion

Let's build a simple chat application that sends a message to an AI model and receives a response.

### Create a new Go file
Create a new file called main.go:

```go
package main

import (
    "context"
    "fmt"

    dedalus "github.com/dedalus-labs/dedalus-sdk-go"
    "github.com/dedalus-labs/dedalus-sdk-go/option"
    "github.com/dedalus-labs/dedalus-sdk-go/shared"
)

func main() {
    // Initialize the client
    client := dedalus.NewClient(
        option.WithAPIKey("My API Key"), // Replace with your API key
    )

    // Create a chat completion
    chatCompletion, err := client.Chat.Completions.New(
        context.TODO(),
        dedalus.ChatCompletionNewParams{
            Model: dedalus.F[dedalus.ChatCompletionNewParamsModelUnion](
                shared.UnionString("openai/gpt-5-nano"),
            ),
            Messages: dedalus.F(
                []dedalus.ChatCompletionNewParamsMessageUnion{
                    dedalus.ChatCompletionUserMessageParam{
                        Role: dedalus.F(dedalus.ChatCompletionUserMessageParamRoleUser),
                        Content: dedalus.F[dedalus.ChatCompletionUserMessageParamContentUnion](
                            shared.UnionString("Hello, how are you today?"),
                        ),
                    },
                },
            ),
        },
    )
    if err != nil {
        panic(err.Error())
    }

    // Print the response
    fmt.Printf("Completion ID: %s\n", chatCompletion.ID)
    fmt.Printf("Model: %s\n", chatCompletion.Model)
    fmt.Printf("Response: %s\n", chatCompletion.Choices[0].Message.Content)
}
```
### Run your application

Execute the file with `go run main.go`. You should see output similar to:

```
Completion ID: cmpl_123abc
Model: openai/gpt-5-nano
Response: Hello! I'm doing well, thank you for asking...
```
## Understanding the Code

```go
client := dedalus.NewClient(
    option.WithAPIKey("My API Key"),
)
```

The client is initialized with your API key. You can also use environment variables:

```go
// Reads from the DEDALUS_API_KEY environment variable
client := dedalus.NewClient()
```
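For example, you might export the key in your shell before running the program (the variable name is taken from the comment above; the key value is a placeholder):

```shell
export DEDALUS_API_KEY="your-api-key-here"
go run main.go
```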
```go
dedalus.ChatCompletionNewParams{
    Model: dedalus.F[dedalus.ChatCompletionNewParamsModelUnion](
        shared.UnionString("openai/gpt-5-nano"),
    ),
    Messages: dedalus.F(...),
}
```

All request parameters are wrapped in a generic Field type using the `F()` helper. This prevents accidentally sending zero values and allows explicit null values.
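To see why such a wrapper helps, here is a minimal conceptual sketch of the pattern. This is not the SDK's actual implementation, just an illustration: a generic field records whether it was explicitly set, so an intentional zero value can be distinguished from a field that was simply omitted.

```go
package main

import "fmt"

// field records whether a value was explicitly provided,
// so a deliberate zero is distinguishable from "not set".
type field[T any] struct {
    value   T
    present bool
}

// set wraps a value and marks it as present, analogous to the SDK's F() helper.
func set[T any](v T) field[T] {
    return field[T]{value: v, present: true}
}

func main() {
    var omitted field[int] // zero value: not set
    explicit := set(0)     // explicitly set to zero

    fmt.Println(omitted.present)  // false: the field was never set
    fmt.Println(explicit.present) // true: zero was set on purpose
    fmt.Println(explicit.value)   // 0
}
```

A serializer can then skip fields whose `present` flag is false, instead of guessing whether a Go zero value was intentional.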
```go
chatCompletion, err := client.Chat.Completions.New(...)
if err != nil {
    panic(err.Error())
}
```

The SDK returns structured response objects with type-safe access to all fields. Check for errors using standard Go error handling.
## Multi-Turn Conversations

Build a conversation with system messages and multiple turns:

```go
package main

import (
    "context"
    "fmt"

    dedalus "github.com/dedalus-labs/dedalus-sdk-go"
    "github.com/dedalus-labs/dedalus-sdk-go/option"
    "github.com/dedalus-labs/dedalus-sdk-go/shared"
)

func main() {
    client := dedalus.NewClient(
        option.WithAPIKey("My API Key"),
    )

    chatCompletion, err := client.Chat.Completions.New(
        context.TODO(),
        dedalus.ChatCompletionNewParams{
            Model: dedalus.F[dedalus.ChatCompletionNewParamsModelUnion](
                shared.UnionString("openai/gpt-5-nano"),
            ),
            Messages: dedalus.F(
                []dedalus.ChatCompletionNewParamsMessageUnion{
                    // System message sets the behavior
                    dedalus.ChatCompletionSystemMessageParam{
                        Role: dedalus.F(dedalus.ChatCompletionSystemMessageParamRoleSystem),
                        Content: dedalus.F[dedalus.ChatCompletionSystemMessageParamContentUnion](
                            shared.UnionString("You are Stephen Dedalus. Respond in morose Joycean malaise."),
                        ),
                    },
                    // User message
                    dedalus.ChatCompletionUserMessageParam{
                        Role: dedalus.F(dedalus.ChatCompletionUserMessageParamRoleUser),
                        Content: dedalus.F[dedalus.ChatCompletionUserMessageParamContentUnion](
                            shared.UnionString("Hello, how are you today?"),
                        ),
                    },
                },
            ),
        },
    )
    if err != nil {
        panic(err.Error())
    }

    fmt.Printf("%+v\n", chatCompletion.Choices[0].Message.Content)
}
```
System messages help guide the model’s behavior and personality throughout the conversation.
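To carry a conversation across multiple requests, the usual pattern is to keep your own message history and append each assistant reply before the next user turn. Here is a minimal sketch of that bookkeeping using plain structs; in real code the SDK's message param types and the actual API response would replace the hypothetical `msg` type and the hard-coded reply:

```go
package main

import "fmt"

// msg is a stand-in for the SDK's message param types.
type msg struct {
    role    string
    content string
}

func main() {
    // Seed the history with the system prompt.
    history := []msg{{role: "system", content: "You are a helpful assistant."}}

    // Turn 1: append the user message, call the API, then append the reply.
    history = append(history, msg{role: "user", content: "Hello!"})
    reply := "Hi there!" // in real code: chatCompletion.Choices[0].Message.Content
    history = append(history, msg{role: "assistant", content: reply})

    // Turn 2: the next request sends the full history plus the new message,
    // so the model sees the whole conversation so far.
    history = append(history, msg{role: "user", content: "Tell me more."})

    fmt.Println(len(history))     // 4
    fmt.Println(history[2].role)  // assistant
}
```

Because chat completion requests are stateless, forgetting to append the assistant's reply is a common source of models "losing" context between turns.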
## Streaming Responses

For real-time applications, use streaming to receive responses as they're generated:

```go
package main

import (
    "context"
    "fmt"

    dedalus "github.com/dedalus-labs/dedalus-sdk-go"
    "github.com/dedalus-labs/dedalus-sdk-go/option"
    "github.com/dedalus-labs/dedalus-sdk-go/shared"
)

func main() {
    client := dedalus.NewClient(
        option.WithAPIKey("My API Key"),
    )

    stream := client.Chat.Completions.NewStreaming(
        context.TODO(),
        dedalus.ChatCompletionNewParams{
            Model: dedalus.F[dedalus.ChatCompletionNewParamsModelUnion](
                shared.UnionString("openai/gpt-5-nano"),
            ),
            Messages: dedalus.F(
                []dedalus.ChatCompletionNewParamsMessageUnion{
                    dedalus.ChatCompletionUserMessageParam{
                        Role: dedalus.F(dedalus.ChatCompletionUserMessageParamRoleUser),
                        Content: dedalus.F[dedalus.ChatCompletionUserMessageParamContentUnion](
                            shared.UnionString("Tell me a story"),
                        ),
                    },
                },
            ),
        },
    )

    // Iterate through the stream, printing each content delta as it arrives
    for stream.Next() {
        chunk := stream.Current()
        if len(chunk.Choices) > 0 && chunk.Choices[0].Delta.Content != "" {
            fmt.Print(chunk.Choices[0].Delta.Content)
        }
    }
    if err := stream.Err(); err != nil {
        panic(err.Error())
    }

    fmt.Println()
}
```
Streaming is perfect for chat interfaces where you want to display text as it’s generated, providing a better user experience.
## Error Handling

Handle errors gracefully using the SDK's error types:

```go
import (
    "context"
    "errors"
    "fmt"

    dedalus "github.com/dedalus-labs/dedalus-sdk-go"
)

chatCompletion, err := client.Chat.Completions.New(context.TODO(), params)
if err != nil {
    var apierr *dedalus.Error
    if errors.As(err, &apierr) {
        fmt.Printf("API Error: %d\n", apierr.StatusCode)
        fmt.Println(string(apierr.DumpRequest(true)))
        fmt.Println(string(apierr.DumpResponse(true)))
    }
    panic(err.Error())
}
```
## Next Steps

- **Chat Completions**: Explore advanced chat completion features
- **Embeddings**: Generate text embeddings for semantic search
- **Audio Processing**: Learn about speech synthesis and transcription
- **Image Generation**: Create and edit images with AI