## Overview
Constraints files provide a way to:

- Specify preferred metrics and evaluation criteria
- Guide model selection and algorithm choices
- Control preprocessing and feature engineering
- Define termination conditions
- Enforce business rules and requirements
## File Format
Constraints are written in Markdown with section headers (`##`) defining different constraint categories. Each section contains bullet points describing specific constraints.

## Standard Sections
While you can create custom sections, these standard sections are recognized by the autopilot:

### Metrics
Define which metrics to optimize and track:

- Regression: RMSE, MAE, R², MAPE
- Classification: Accuracy, F1, Precision, Recall, ROC-AUC
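For instance, a `Metrics` section might look like this (an illustrative sketch; the exact wording is up to you, since the constraints are interpreted as natural language):

```markdown
## Metrics
- Optimize RMSE as the primary metric
- Also track MAE and R² for reporting
```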
### Models
Specify model preferences and restrictions:

- Tree-based: Decision Trees, Random Forest, Extra Trees
- Boosting: XGBoost, LightGBM, CatBoost, AdaBoost
- Linear: Linear/Logistic Regression, Ridge, Lasso, ElasticNet
- Neural networks: MLP, deep learning models
- Other: SVM, KNN, Naive Bayes
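A `Models` section can combine preferences and restrictions. This sketch is illustrative, not taken from the sample data:

```markdown
## Models
- Prefer XGBoost and LightGBM for tabular data
- Avoid KNN because the dataset is large
```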
### Preprocessing
Control data preprocessing and feature engineering:

- Scaling: StandardScaler, MinMaxScaler, RobustScaler
- Imputation: Mean, median, mode, forward-fill, KNN imputation
- Encoding: One-hot, label encoding, target encoding
- Transformations: Log, square root, box-cox
- Feature engineering: Polynomial features, interactions, binning
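For example, a `Preprocessing` section might read (illustrative only):

```markdown
## Preprocessing
- Scale numeric features with StandardScaler
- Impute missing numeric values with the median
- One-hot encode low-cardinality categorical features
```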
### Termination
Define when experiments should stop (for example, after a maximum number of experiments, when a time budget is exhausted, or when a target metric value is reached).

### Features
Constraints about feature selection and engineering (for example, columns that must be kept or a cap on the total number of features).

### Validation
Specify the validation strategy (for example, k-fold cross-validation or a hold-out split).

### Performance
Resource and runtime constraints (for example, memory limits or maximum training time).

## Complete Example
Here’s a real constraints file from the sample data:

## Example Use Cases
### Financial Modeling
For a loan default prediction model:

### E-commerce Recommendations
For predicting customer purchase amounts:

### Medical Diagnosis
For disease prediction with strict accuracy requirements:

## Usage
Pass your constraints file using the `--constraints` (or `-c`) flag:
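A typical invocation might look like the following; `autopilot` and the dataset argument are placeholders for your actual CLI entry point and arguments, and only `--constraints` / `-c` come from this document:

```shell
# Placeholder CLI name and dataset argument
autopilot run --dataset data/train.csv --constraints constraints.md

# Equivalent short form
autopilot run --dataset data/train.csv -c constraints.md
```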
## How Constraints Are Applied
The autopilot interprets constraints at different stages:

- Experiment Planning - Gemini reads constraints and creates an initial experiment plan
- Model Selection - Filters and prioritizes models based on preferences
- Preprocessing - Applies specified preprocessing steps
- Evaluation - Uses specified metrics and validation strategies
- Iteration - Checks termination conditions after each experiment
- Reporting - Highlights compliance with constraints in final report
## Natural Language Flexibility
Constraints are interpreted by Gemini, so you can write them in natural language; for example, “Focus on tree-based models because the data is tabular.”

## Best Practices
### Be Specific
Vague constraints like “use good models” are less helpful than specific ones like “prefer XGBoost and LightGBM for tabular data”.
### Explain Why
Including reasoning helps Gemini make better decisions:
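For example, pairing each restriction with its rationale (an illustrative sketch):

```markdown
## Models
- Avoid deep learning models because predictions must be
  explainable to stakeholders
- Prefer simpler models because the model will be retrained often
```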
### Prioritize Constraints
Use language like “must”, “prefer”, “avoid” to indicate importance:
- Must = Hard requirement
- Prefer = Strong preference but flexible
- Avoid = Try to avoid but acceptable if needed
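An illustrative section mixing all three priority levels:

```markdown
## Models
- Must not use neural networks (hard requirement)
- Prefer gradient boosting models for accuracy
- Avoid KNN if possible (slow at prediction time)
```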
### Don't Over-Constrain
Too many restrictions can limit the autopilot’s ability to find good solutions. Start with a few key constraints and add more as needed.
### Test Without Constraints First
Run one experiment without constraints to see the baseline, then add constraints to guide improvements.
## Constraint Validation
The autopilot will:

- Parse and validate your constraints file
- Warn about conflicting constraints
- Show how constraints are being applied (use `--verbose`)
- Report constraint compliance in the final results
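Conceptually, the parsing step splits the Markdown file into sections and their bullet points. The function below is only a minimal sketch of that idea under stated assumptions; the autopilot's actual parser and validation logic are not shown in this document:

```python
def parse_constraints(text: str) -> dict[str, list[str]]:
    """Split a Markdown constraints file into {section: [bullets]}.

    Illustrative sketch only: assumes flat `## Section` headers
    followed by `- bullet` lines, as described in File Format.
    """
    sections: dict[str, list[str]] = {}
    current = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("## "):
            current = line[3:].strip()
            sections[current] = []
        elif line.startswith("- ") and current is not None:
            sections[current].append(line[2:].strip())
    return sections


sample = """\
## Metrics
- Optimize RMSE
- Also track MAE

## Models
- Prefer XGBoost and LightGBM
"""

parsed = parse_constraints(sample)
print(parsed["Metrics"])  # → ['Optimize RMSE', 'Also track MAE']
```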
## Examples Directory
For more examples, see the sample constraints in the repository:

## See Also
- Run Command - How to use constraints in commands
- Arguments Reference - The `--constraints` argument
- Configuration - Gemini API setup for constraint interpretation