
Flock treats models, prompts, and API credentials as first-class SQL resources. You create and manage them with dedicated SQL commands — no config files, no environment variables to juggle across sessions. Each resource type has its own storage table that is initialized per database when Flock loads, and every resource can be scoped as either local (the current database) or global (shared across all databases in the session).
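
The scoping behavior can be sketched in a short session. This is an illustrative sketch, not output from a real run: ATTACH and USE are standard DuckDB statements, and the model names are made up for the example:

-- Global models survive switching to another database in the same session
CREATE GLOBAL MODEL('team-gpt4o', 'gpt-4o', 'openai');
CREATE LOCAL MODEL('scratch-llama', 'llama3.1', 'ollama');

ATTACH 'other.db';
USE other;

GET MODELS;
-- 'team-gpt4o' is still listed; 'scratch-llama' is not,
-- because local models are stored per database.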

Model configuration

A model record maps a short model_name alias to the underlying provider model, its serialization format, batching behavior, and any provider-specific parameters. Each model stores the following fields:

- model_name (string, required): Unique alias used to reference this model in LLM function calls (e.g., 'gpt-4o').
- model_type (string, required): The underlying model identifier as the provider expects it (e.g., 'gpt-4o', 'llama3.1').
- provider (string, required): The provider that serves the model. One of openai, azure, ollama, or anthropic.
- tuple_format (string, default "XML"): How input rows are serialized into the prompt. One of JSON, XML, or Markdown.
- batch_size (integer, default 2048): Maximum number of rows passed to the model in a single API call.
- model_parameters (object, optional): Provider-specific parameters passed directly to the API (e.g., temperature, top_p, n, frequency_penalty), expressed as a JSON object.
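
The fields above map directly onto the arguments of CREATE MODEL. A sketch with each argument annotated; the alias and parameter values are illustrative:

CREATE MODEL(
    'my-gpt4o',   -- model_name: alias used in LLM function calls
    'gpt-4o',     -- model_type: provider-side model identifier
    'openai',     -- provider: openai, azure, ollama, or anthropic
    {
        "tuple_format": "JSON",  -- row serialization: JSON, XML, or Markdown
        "batch_size": 32,        -- max rows per API call
        "model_parameters": {    -- passed through to the provider API
            "temperature": 0.2
        }
    }
);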

Reading models

-- List all available models (system-defined and user-defined)
GET MODELS;

-- Inspect a specific model
GET MODEL 'model_name';

Creating models

Omitting the config object uses the defaults for tuple_format (XML) and batch_size (2048):
-- Minimal: no config object
CREATE MODEL('my-gpt4o', 'gpt-4o', 'openai');

-- With JSON tuple format and temperature
CREATE MODEL(
    'my-gpt4o',
    'gpt-4o',
    'openai',
    {
        "tuple_format": "JSON",
        "batch_size": 8,
        "model_parameters": {
            "temperature": 0.2,
            "top_p": 0.95
        }
    }
);

-- With XML tuple format
CREATE MODEL(
    'my-llama',
    'llama3.1',
    'ollama',
    {
        "tuple_format": "XML",
        "batch_size": 8,
        "model_parameters": {
            "n": 3,
            "frequency_penalty": 0.1
        }
    }
);

-- With Markdown tuple format
CREATE MODEL(
    'my-claude',
    'claude-opus-4-5',
    'anthropic',
    {
        "tuple_format": "Markdown",
        "batch_size": 8
    }
);

Global and local models

By default, a model is local to the current database. Use CREATE GLOBAL MODEL to make it available across all databases:
-- Global model — available in all databases
CREATE GLOBAL MODEL(
    'shared-gpt4o',
    'gpt-4o',
    'openai',
    {
        "tuple_format": "JSON",
        "batch_size": 8,
        "model_parameters": {
            "temperature": 0.2,
            "top_p": 0.95
        }
    }
);

-- Local model — explicit keyword, same as the default
CREATE LOCAL MODEL('local-llama', 'llama3.1', 'ollama');

-- Promote or demote an existing model
UPDATE MODEL 'shared-gpt4o' TO LOCAL;
UPDATE MODEL 'local-llama' TO GLOBAL;

Updating and deleting models

-- Update model (same config rules as CREATE)
UPDATE MODEL(
    'my-gpt4o',
    'gpt-4o',
    'openai',
    {
        "tuple_format": "JSON",
        "batch_size": 16,
        "model_parameters": {
            "temperature": 0.5
        }
    }
);

-- Delete a model
DELETE MODEL 'my-gpt4o';

Using a model in a query

Reference the model by its model_name alias in any Flock LLM function:
SELECT llm_complete(
    {'model_name': 'my-gpt4o'},
    {'prompt_name': 'product-description'},
    {'input_text': product_description}
) AS generated_description
FROM products;
