A datasource is SQLBot’s connection to your data. Before you can ask any natural language questions, you need to connect at least one database or upload a file. SQLBot uses your datasource to introspect the schema, surface table and field metadata to the LLM, and execute the generated SQL. This page walks through every step of adding and configuring a datasource.
Supported database types
SQLBot can connect to the following systems out of the box:
MySQL
PostgreSQL
Microsoft SQL Server
Oracle
ClickHouse
Elasticsearch
AWS Redshift
Apache Hive
Apache Doris
StarRocks
DM (达梦)
Kingbase
SQLite
Excel / CSV
Only workspace admins can add, edit, or delete datasources. Regular workspace members can query existing datasources but cannot modify connection settings.
Adding a database connection
Navigate to Datasources
In the left sidebar, click Datasources. The list shows all datasources already connected to your workspace.
Choose the database type
Select your database engine from the type dropdown. The form adapts to show the correct connection fields for that engine (host, port, database name, username, password, and any engine-specific options).
Fill in connection details
Enter the connection parameters. For example, a PostgreSQL connection requires a host, port (5432 by default), database name, username, and password.
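To make the parameters concrete, here is a hedged sketch that assembles them into a libpq-style DSN string. The host, database, and user names are illustrative placeholders, not SQLBot defaults.

```python
# Hypothetical example: the fields SQLBot's connection form collects for a
# PostgreSQL datasource, assembled into a libpq-style key/value DSN.
# All concrete values below are illustrative, not product defaults.

def build_dsn(host, port, database, user, password):
    """Assemble a libpq key/value DSN from the form fields."""
    return (
        f"host={host} port={port} dbname={database} "
        f"user={user} password={password}"
    )

dsn = build_dsn("db.internal", 5432, "analytics", "sqlbot_ro", "s3cret")
print(dsn)
```

The same five fields map onto the form for most of the other engines; only the default port and engine-specific options change.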
Test the connection
Click Test Connection. SQLBot attempts to connect and returns a success or failure status. If the test fails, double-check host reachability, credentials, and firewall rules before saving.
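If the test fails, a quick way to separate network problems from credential problems is a plain TCP check against the database port, independent of SQLBot. A minimal sketch:

```python
import socket

def reachable(host, port, timeout=3.0):
    """Cheap pre-check before Test Connection: can we open a TCP socket
    to the database port at all? If not, the failure is DNS, routing, or
    firewall related, not a credentials problem."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example usage with an illustrative host name:
# reachable("db.internal", 5432)
```

If this returns True but Test Connection still fails, the issue is most likely the username, password, or database name rather than reachability.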
Selecting tables
After saving a datasource, SQLBot fetches all available tables from the database. You must explicitly select which tables SQLBot is allowed to query. This controls both what the LLM sees in its schema context and what SQL it is permitted to generate.
Enable the tables you want
Toggle on each table that SQLBot should be able to query. Untoggled tables are excluded from schema context entirely — the LLM will not know they exist.
Annotating table and field metadata
For each enabled table, you can add a description (called a custom comment) to both the table itself and each of its columns. These descriptions are injected into the LLM’s schema context and directly affect query accuracy. To edit metadata:
- Click a table name to expand it.
- Enter a plain-language description in the Table description field. For example: Stores one row per completed customer order, including totals and shipping status.
- For each column, add a Field description explaining what it contains. For example, a column named gmv might get the description Gross merchandise value in USD, before refunds.
- Click Save.
Two bulk options speed this up:
- Export schema downloads an .xlsx file with all your tables and field descriptions pre-filled.
- Import schema lets you fill in or update descriptions in the spreadsheet and re-upload it, applying all changes in one batch.
Uploading Excel or CSV files
Excel and CSV files are treated as a special datasource type. SQLBot imports the file into its internal PostgreSQL store, making the data queryable just like any other database.
Upload your file
Drag and drop or browse to your .xlsx, .xls, or .csv file. SQLBot previews the detected columns and inferred data types before importing.
Confirm column types
Review the inferred field types (string, integer, decimal, and so on). Adjust any types that were detected incorrectly, then click Import.
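The kind of inference described above can be sketched as: sample each column’s raw values and pick the narrowest type that fits them all. SQLBot’s actual rules may differ; this is purely illustrative of why a column of numeric-looking strings can still come out as string if even one value fails to parse.

```python
# Illustrative sketch of column type inference: try the narrowest type
# first and widen on failure. Not SQLBot's actual implementation.

def infer_type(values):
    def fits(cast):
        try:
            for v in values:
                cast(v)
            return True
        except ValueError:
            return False
    if fits(int):
        return "integer"
    if fits(float):
        return "decimal"
    return "string"

print(infer_type(["1", "2", "3"]))  # integer
print(infer_type(["1.5", "2"]))     # decimal
print(infer_type(["a", "b"]))       # string
```

This is why it is worth reviewing the preview: a stray text value in an otherwise numeric column will widen the whole column to string.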
Each sheet in an .xlsx file becomes a separate table. For .csv files, a single table named Sheet1 is created. Table names are auto-generated with a short unique suffix to avoid collisions.
Syncing schema changes
If your database schema changes after initial setup (new columns are added, tables are renamed), you can resync the field list for a table without removing and recreating the datasource:
- Open the datasource and select the table.
- Click Sync fields.
- SQLBot fetches the latest column list from the live database and reconciles it with what it has stored. New fields are added; removed fields are marked inactive.
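The reconciliation step above amounts to a set comparison between the live and stored column lists. A minimal sketch, with illustrative column names:

```python
# Sketch of the Sync fields reconciliation: columns present only in the
# live database are added; columns present only in the stored metadata
# are flagged inactive rather than deleted. Column names are examples.

def reconcile(stored, live):
    stored_names = set(stored)
    live_names = set(live)
    added = sorted(live_names - stored_names)
    inactive = sorted(stored_names - live_names)
    return added, inactive

added, inactive = reconcile(["id", "email", "legacy_flag"],
                            ["id", "email", "signup_date"])
print(added)     # ['signup_date']
print(inactive)  # ['legacy_flag']
```

Marking removed fields inactive instead of deleting them preserves any descriptions you wrote, in case the column reappears later.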
Managing existing datasources
From the datasource list you can:
- Edit a datasource to update connection credentials or rename it.
- Test an existing connection at any time to verify it is still reachable.
- Delete a datasource, which also removes all associated table and field metadata. Any chat sessions that referenced the datasource will no longer be able to regenerate queries.