Test Connection

POST /api/settings/test-connection
curl --request POST \
  --url https://api.example.com/api/settings/test-connection \
  --header 'Content-Type: application/json' \
  --data '
{
  "provider": "<string>",
  "key": "<string>"
}
'
{
  "success": true,
  "provider": "<string>",
  "message": "<string>"
}

Overview

Verify that an AI provider API key is valid and the service is accessible by making a test completion request. This endpoint does not store the key; it only validates connectivity.

Authentication

Requires authentication via session cookie or bearer token.

How It Works

The endpoint performs the following validation:
  1. Creates a temporary provider instance with the supplied key
  2. Sends a minimal completion request (“Say hello in one word”)
  3. Uses provider-specific test models:
    • Anthropic: claude-sonnet-4-20250514
    • OpenAI: gpt-4o-mini
    • Ollama: llama3.2
  4. Returns success if the provider responds, along with a preview of the response

This is a “dry run” endpoint: it tests the key without storing it. Use POST /api/settings/api-keys to store a key after validation.
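The flow above can be sketched as follows. This is a minimal illustration, not the real handler (which lives in src/routes/settings.rs); the function and dictionary names are hypothetical, and only the request shape and result messages mirror what the endpoint documents.

```python
# Hypothetical sketch of the dry-run validation flow (names are illustrative).
TEST_MODELS = {
    "anthropic": "claude-sonnet-4-20250514",
    "openai": "gpt-4o-mini",
    "ollama": "llama3.2",
}

def build_test_request(provider: str) -> dict:
    """Build the minimal completion request used for the dry run."""
    if provider not in TEST_MODELS:
        raise ValueError(f"Unsupported provider: {provider}")
    return {
        "model": TEST_MODELS[provider],
        "prompt": "Say hello in one word",
        "max_tokens": 32,    # minimal token budget (see Security Notes)
        "temperature": 0.0,  # deterministic output
    }

def summarize_result(provider: str, response_text, error) -> dict:
    """Shape the outcome the way the endpoint reports it."""
    if error is not None:
        return {"success": False, "provider": provider,
                "message": f"Connection failed: {error}"}
    preview = (response_text or "")[:100]  # only the first 100 chars are returned
    return {"success": True, "provider": provider,
            "message": f"Connection successful. Provider: {provider}. Response: {preview}"}
```

Note that the key itself never appears in the result, only the provider name and a response preview.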

Request Body

provider
string
required
AI provider to test. Must be one of: anthropic, openai, ollama
key
string
required
The API key or base URL to test. For Ollama, this should be the base URL (e.g., http://localhost:11434).
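Because the key field doubles as the base URL for Ollama, building the body programmatically can avoid confusion. A small sketch (the field names come from the schema above; make_body is an illustrative helper, not part of any client library):

```python
import json

def make_body(provider: str, key_or_url: str) -> str:
    """Serialize the request body; for Ollama, pass the base URL as the key."""
    return json.dumps({"provider": provider, "key": key_or_url})
```

For example, make_body("ollama", "http://localhost:11434") produces the payload for testing a local Ollama instance.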

Response

success
boolean
Whether the connection test succeeded
provider
string
The provider name that was tested
message
string
Human-readable result message. On success, includes a preview of the AI response (first 100 characters). On failure, includes the error details.

Example

curl -X POST https://api.heimdall.dev/api/settings/test-connection \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "provider": "anthropic",
    "key": "sk-ant-api03-..."
  }'

Response Examples

Successful Connection

{
  "status": "ok",
  "data": {
    "success": true,
    "provider": "anthropic",
    "message": "Connection successful. Provider: anthropic. Response: Hello"
  }
}

Failed Connection

{
  "status": "ok",
  "data": {
    "success": false,
    "provider": "anthropic",
    "message": "Connection failed: Invalid API key"
  }
}
Even when the connection test fails, the HTTP status is 200 OK. Check the success field in the response data to determine if the test passed.
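A client therefore has to unwrap the envelope and read data.success rather than relying on the HTTP status. A minimal sketch, assuming the response envelope shown above:

```python
import json

def connection_ok(body: str):
    """Return (success, message) from a test-connection response.
    The HTTP status alone is not enough: failures also come back as 200 OK."""
    data = json.loads(body).get("data", {})
    return bool(data.get("success")), data.get("message", "")
```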

Error Responses

400 Bad Request: Unsupported provider (must be anthropic, openai, or ollama)

Provider-Specific Notes

Anthropic

  • Expects API keys in format sk-ant-api03-...
  • Tests with claude-sonnet-4-20250514 model
  • Validates against Anthropic’s API endpoint

OpenAI

  • Expects API keys in format sk-...
  • Tests with gpt-4o-mini model
  • Validates against OpenAI’s API endpoint

Ollama

  • Expects a base URL (e.g., http://localhost:11434)
  • Tests with llama3.2 model
  • Requires Ollama service to be running and accessible
  • Useful for validating self-hosted Ollama instances
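The format expectations above lend themselves to a cheap client-side sanity check before calling the endpoint at all. This is a heuristic convenience only, an assumption layered on the listed formats, not what the server enforces:

```python
import re

def looks_valid(provider: str, key: str) -> bool:
    """Heuristic pre-check of key/URL shape per provider (client-side only)."""
    if provider == "anthropic":
        return key.startswith("sk-ant-api03-")
    if provider == "openai":
        return key.startswith("sk-")
    if provider == "ollama":
        return re.match(r"^https?://", key) is not None
    return False  # unsupported providers are rejected by the endpoint anyway
```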

Use Cases

  1. Pre-validation: Test a key before storing it to ensure it’s valid
  2. Troubleshooting: Diagnose connectivity issues with AI providers
  3. Setup Verification: Confirm that Ollama or other services are properly configured
  4. Key Rotation: Verify new keys before replacing old ones
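The key-rotation case can be sketched as a two-step flow: dry-run the new key via test-connection, and only then store it via POST /api/settings/api-keys. The tester and store callables below are hypothetical stand-ins injected so the flow can run without a live server:

```python
def rotate_key(provider: str, new_key: str, tester, store) -> dict:
    """Validate a replacement key before committing it.
    tester(provider, key) -> {"success": bool, "message": str}  (dry run)
    store(provider, key)  -> persists the key once validated."""
    result = tester(provider, new_key)
    if not result.get("success"):
        raise RuntimeError(f"Refusing to rotate key: {result.get('message')}")
    store(provider, new_key)
    return result
```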

Security Notes

  • The key is not stored or logged
  • The test request uses minimal tokens (max_tokens: 32)
  • Temperature is set to 0.0 for consistent results
  • Only the first 100 characters of the response are returned

Implementation Reference

See src/routes/settings.rs:344 for the endpoint implementation.
