DecisionTreeClassifier
A decision tree classifier using the CART algorithm with Gini impurity.
Constructor
Parameters
Maximum depth of the tree. Deeper trees can model more complex patterns but may overfit.
Minimum number of samples required to split an internal node.
Minimum number of samples required to be at a leaf node.
Number of features to consider when looking for the best split:
"sqrt": sqrt(n_features)"log2": log2(n_features)number: specific number of featuresnull: use all features
Random seed for reproducible feature selection when maxFeatures < n_features.
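A self-contained sketch of how a maxFeatures setting resolves to a feature count, per the options above (illustrative; the function name and clamping details are assumptions, not the library's internals):

```typescript
type MaxFeatures = "sqrt" | "log2" | number | null;

// Resolve a maxFeatures setting to the number of features
// considered at each split, given n_features total.
function resolveMaxFeatures(maxFeatures: MaxFeatures, nFeatures: number): number {
  if (maxFeatures === null) return nFeatures;                          // use all features
  if (maxFeatures === "sqrt") return Math.max(1, Math.floor(Math.sqrt(nFeatures)));
  if (maxFeatures === "log2") return Math.max(1, Math.floor(Math.log2(nFeatures)));
  return Math.min(maxFeatures, nFeatures);                             // clamp a numeric value
}
```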
Methods
fit()
Train the decision tree classifier.
Training data of shape [n_samples, n_features].
Target values (class labels).
predict()
Predict class labels for samples.
Samples to predict, shape [n_samples, n_features].
score()
Return the accuracy on the given test data.
dispose()
Free native resources if using the Zig backend.
Properties
Unique class labels found during training.
Feature importances (Gini importance). Higher values indicate more important features.
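Gini importance accumulates, over the splits that use a feature, the impurity decrease weighted by node size. As a reference point, here is a self-contained sketch of the Gini impurity itself (illustrative, not the library's actual code):

```typescript
// Gini impurity of a set of class labels:
// G = 1 - sum_k p_k^2, where p_k is the fraction of samples in class k.
// G is 0 for a pure node and grows as classes mix.
function giniImpurity(labels: number[]): number {
  const counts = new Map<number, number>();
  for (const y of labels) counts.set(y, (counts.get(y) ?? 0) + 1);
  let sumSq = 0;
  for (const c of counts.values()) sumSq += (c / labels.length) ** 2;
  return 1 - sumSq;
}
```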
Backend used for training:
"zig" (native) or "js" (JavaScript).Path to native library if Zig backend was used.
Zig Backend
DecisionTreeClassifier can use a Zig-powered native backend for faster training. The backend is automatically selected when:
- Number of classes is between 2 and 256
- BUN_SCIKIT_TREE_BACKEND is not set to "js" or "off"
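The documented selection rule can be sketched as follows (the function name and signature are illustrative, not the library's internals; the real implementation presumably also checks that the native library is available):

```typescript
// Pick the backend from the documented conditions:
// zig when 2 <= nClasses <= 256 and the env override does not disable it.
function selectBackend(nClasses: number, envOverride?: string): "zig" | "js" {
  if (envOverride === "js" || envOverride === "off") return "js";
  if (nClasses >= 2 && nClasses <= 256) return "zig";
  return "js";
}
```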
Example
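A hedged usage sketch of the API described above; the import path and constructor option names (maxDepth, randomState) are assumptions inferred from this reference, not confirmed names:

```typescript
// Import path is a placeholder; use your project's actual entry point.
import { DecisionTreeClassifier } from "bun-scikit"; // hypothetical module name

const clf = new DecisionTreeClassifier({ maxDepth: 3, randomState: 42 });

const X = [[0, 0], [0, 1], [1, 0], [1, 1]]; // [n_samples, n_features]
const y = [0, 1, 1, 0];                     // class labels

clf.fit(X, y);
const preds = clf.predict(X);
console.log(clf.score(X, y), clf.backend);

clf.dispose(); // free native resources if the Zig backend was used
```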
DecisionTreeRegressor
A decision tree regressor using the CART algorithm with variance reduction.
Constructor
Parameters
Maximum depth of the tree.
Minimum number of samples required to split an internal node.
Minimum number of samples required to be at a leaf node.
Number of features to consider when looking for the best split.
Random seed for reproducible feature selection.
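The variance-reduction criterion named above scores a split by how much it lowers the size-weighted variance of the targets; a minimal self-contained sketch (illustrative, not the library's code):

```typescript
// Population variance of a list of target values.
function variance(ys: number[]): number {
  const mean = ys.reduce((a, b) => a + b, 0) / ys.length;
  return ys.reduce((a, y) => a + (y - mean) ** 2, 0) / ys.length;
}

// Variance reduction from splitting a node into `left` and `right`:
// parent variance minus the size-weighted child variances.
function varianceReduction(left: number[], right: number[]): number {
  const parent = left.concat(right);
  const weighted =
    (left.length / parent.length) * variance(left) +
    (right.length / parent.length) * variance(right);
  return variance(parent) - weighted;
}
```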
Methods
fit()
Train the decision tree regressor.
Training data of shape [n_samples, n_features].
Target values (continuous).
predict()
Predict target values for samples.
Samples to predict, shape [n_samples, n_features].
score()
Return the R² score on the given test data.
Properties
Feature importances based on variance reduction.
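For reference, the R² score returned by the regressor's score() is the coefficient of determination; a self-contained sketch (illustrative, not the library's implementation):

```typescript
// R^2 = 1 - SS_res / SS_tot, where SS_res is the sum of squared
// residuals and SS_tot the total sum of squares around the mean.
function r2Score(yTrue: number[], yPred: number[]): number {
  const mean = yTrue.reduce((a, b) => a + b, 0) / yTrue.length;
  let ssRes = 0;
  let ssTot = 0;
  for (let i = 0; i < yTrue.length; i++) {
    ssRes += (yTrue[i] - yPred[i]) ** 2;
    ssTot += (yTrue[i] - mean) ** 2;
  }
  return 1 - ssRes / ssTot;
}
```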