AdaBoostClassifier
Adaptive Boosting classifier for binary classification. Fits a sequence of weak learners on weighted samples, focusing on difficult cases.
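The boosting arithmetic behind this class can be illustrated standalone. The sketch below (plain TypeScript, not the library's internals) shows a classic SAMME-style round: an estimator weight derived from the weighted error, scaled by learningRate, and a sample reweighting that inflates misclassified samples:

```typescript
// Sketch of one AdaBoost round's weight math (illustrative, not bun-scikit's code).
// err: weighted error of the weak learner, in (0, 0.5) for a useful learner.
// learningRate shrinks each estimator's contribution, as in the options above.
function estimatorWeight(err: number, learningRate = 1.0): number {
  // alpha = learningRate * 0.5 * ln((1 - err) / err)
  return learningRate * 0.5 * Math.log((1 - err) / err);
}

// Reweight samples: inflate the weight of misclassified samples, then normalize.
function updateSampleWeights(
  weights: number[],
  correct: boolean[],
  alpha: number
): number[] {
  const updated = weights.map((w, i) =>
    w * Math.exp(correct[i] ? -alpha : alpha)
  );
  const total = updated.reduce((a, b) => a + b, 0);
  return updated.map((w) => w / total);
}

// A learner with 25% weighted error gets a positive weight...
const alpha = estimatorWeight(0.25); // 0.5 * ln(3) ≈ 0.5493
// ...and the one misclassified sample ends up holding half the total weight.
const w = updateSampleWeights([0.25, 0.25, 0.25, 0.25], [true, true, true, false], alpha);
// w → [0.1667, 0.1667, 0.1667, 0.5]
```

Lower learningRate shrinks every alpha, so each round moves the ensemble less — which is why lower rates usually need a larger nEstimators.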
Constructor
import { AdaBoostClassifier } from "bun-scikit";
const clf = new AdaBoostClassifier(
null, // estimator factory (null = use default)
{
nEstimators: 50,
learningRate: 1.0,
randomState: 42
}
);
Parameters
Constructor signature:
new AdaBoostClassifier(
estimatorFactory?: (() => ClassifierLike) | null,
options?: AdaBoostClassifierOptions
)
estimatorFactory
(() => ClassifierLike) | null
Default: null
Factory function that creates a weak learner. If null, uses DecisionTreeClassifier with max depth 1 (a decision stump). The weak learner must implement:
fit(X: Matrix, y: Vector): any
predict(X: Matrix): Vector
featureImportances_?: Vector | null (optional)
nEstimators
number
Maximum number of estimators to train. Training may stop early if a perfect fit is achieved.
learningRate
number
Shrinks the contribution of each classifier. Lower values require more estimators but can improve generalization.
randomState
number
Seed for reproducible weighted sampling.
Methods
fit()
Train the AdaBoost classifier.
clf.fit(X: Matrix, y: Vector): AdaBoostClassifier
X
Training data of shape [n_samples, n_features].
y
Binary target values (0 or 1).
AdaBoostClassifier only supports binary classification.
predict()
Predict class labels for samples.
clf.predict(X: Matrix): Vector
Samples to predict, shape [n_samples, n_features].
Returns: Predicted binary class labels (0 or 1).
predictProba()
Predict class probabilities for samples.
clf.predictProba(X: Matrix): Matrix
Returns: Matrix of shape [n_samples, 2] with probabilities [P(class=0), P(class=1)].
decisionFunction()
Compute the weighted sum of predictions from all estimators.
clf.decisionFunction(X: Matrix): Vector
Returns: Decision function values. Positive values predict class 1, negative values predict class 0.
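To make the sign convention concrete, here is a standalone sketch (the stumps and alpha weights are invented for illustration, not taken from a fitted model) of how an alpha-weighted vote becomes a decision value and then a class label:

```typescript
type Stump = (x: number[]) => 0 | 1; // a weak learner's prediction for one sample

// Weighted vote: each estimator contributes +alpha for class 1, -alpha for class 0.
function decisionValue(x: number[], stumps: Stump[], alphas: number[]): number {
  return stumps.reduce((sum, stump, i) => {
    const vote = stump(x) === 1 ? 1 : -1;
    return sum + alphas[i] * vote;
  }, 0);
}

// Prediction is just the sign of the decision value.
function predictOne(x: number[], stumps: Stump[], alphas: number[]): 0 | 1 {
  return decisionValue(x, stumps, alphas) > 0 ? 1 : 0;
}

// Two stumps that can disagree; the heavier one (alpha = 0.9) wins ties of opinion.
const stumps: Stump[] = [(x) => (x[0] > 5 ? 1 : 0), (x) => (x[1] > 5 ? 1 : 0)];
const alphas = [0.9, 0.4];
// decisionValue([10, 0], ...) → 0.9 - 0.4 = 0.5, so predictOne returns 1
```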
score()
Return the accuracy on the given test data.
clf.score(X: Matrix, y: Vector): number
Returns: Accuracy score between 0 and 1.
Properties
Collection of fitted weak learners.
estimatorWeights_
Weights for each estimator in the ensemble. Higher weights indicate more accurate estimators.
featureImportances_
Aggregated feature importances, weighted by estimator accuracy.
Example: Default Estimator
import { AdaBoostClassifier } from "bun-scikit";
// Use default estimator (decision stumps)
const clf = new AdaBoostClassifier(null, {
nEstimators: 50,
learningRate: 1.0,
randomState: 42
});
// Binary classification: fraud detection
const X = [
[100, 1, 0], // amount, international, verified
[50, 0, 1],
[5000, 1, 0],
[30, 0, 1],
[10000, 1, 0],
[200, 0, 1]
];
const y = [0, 0, 1, 0, 1, 0]; // 0=legit, 1=fraud
// Train
clf.fit(X, y);
// Predict
const testX = [
[150, 1, 0],
[40, 0, 1]
];
const predictions = clf.predict(testX);
console.log(predictions); // [1, 0]
// Get probabilities
const proba = clf.predictProba(testX);
console.log(proba);
// [[0.35, 0.65], [0.82, 0.18]]
// Estimator weights
console.log("Estimator weights:");
clf.estimatorWeights_.forEach((w, i) => {
console.log(` Tree ${i}: ${w.toFixed(4)}`);
});
Example: Custom Estimator
import { AdaBoostClassifier } from "bun-scikit";
import { DecisionTreeClassifier } from "bun-scikit";
// Use deeper trees as weak learners
const clf = new AdaBoostClassifier(
() => new DecisionTreeClassifier({ maxDepth: 3 }),
{
nEstimators: 30,
learningRate: 0.8,
randomState: 42
}
);
// Reusing X, y, and testX from the example above
clf.fit(X, y);
const predictions = clf.predict(testX);
AdaBoostRegressor
Adaptive Boosting regressor for continuous target variables. Fits a sequence of weak learners, focusing on samples with high prediction error.
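For regression, boosting rounds are typically scored AdaBoost.R2-style: residuals are normalized into a loss in [0, 1], and the round's weight grows as that loss shrinks. A hedged sketch of that bookkeeping (the library's exact loss function may differ):

```typescript
// One round of AdaBoost.R2-style bookkeeping: turn residuals into a normalized
// loss, then into an estimator weight, log(1 / beta). Assumes at least one
// nonzero residual; a real implementation would guard the edge cases.
function regressorRoundWeight(
  residuals: number[],     // |y - prediction| per sample
  sampleWeights: number[]  // current (normalized) sample weights
): number {
  const maxErr = Math.max(...residuals);
  // Linear loss, normalized to [0, 1].
  const losses = residuals.map((r) => r / maxErr);
  // Weighted average loss over the current sample distribution.
  const avgLoss = losses.reduce((s, l, i) => s + l * sampleWeights[i], 0);
  const beta = avgLoss / (1 - avgLoss);
  return Math.log(1 / beta); // larger when the round's average loss is smaller
}

// A round with small residuals on most samples earns a positive weight.
const roundWeight = regressorRoundWeight([0.1, 0.2, 1.0], [1 / 3, 1 / 3, 1 / 3]);
```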
Constructor
import { AdaBoostRegressor } from "bun-scikit";
const reg = new AdaBoostRegressor(
null, // estimator factory (null = use default)
{
nEstimators: 50,
learningRate: 1.0,
randomState: 42
}
);
Parameters
Constructor signature:
new AdaBoostRegressor(
estimatorFactory?: (() => RegressorLike) | null,
options?: AdaBoostRegressorOptions
)
estimatorFactory
(() => RegressorLike) | null
Default: null
Factory function that creates a weak learner. If null, uses DecisionTreeRegressor with max depth 3. The weak learner must implement:
fit(X: Matrix, y: Vector): any
predict(X: Matrix): Vector
featureImportances_?: Vector | null (optional)
nEstimators
number
Maximum number of estimators to train.
learningRate
number
Shrinks the contribution of each regressor.
randomState
number
Seed for reproducible weighted sampling.
Methods
fit()
Train the AdaBoost regressor.
reg.fit(X: Matrix, y: Vector): AdaBoostRegressor
X
Training data of shape [n_samples, n_features].
y
Continuous target values.
predict()
Predict target values using weighted median of estimators.
reg.predict(X: Matrix): Vector
Samples to predict, shape [n_samples, n_features].
Returns: Predicted continuous values (weighted median across estimators).
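The weighted median itself is simple to sketch standalone: sort the per-estimator predictions, then walk the cumulative estimator weights until half the total weight is covered (an illustration, not the library's code):

```typescript
// Weighted median: the smallest prediction whose cumulative weight
// reaches half of the total weight.
function weightedMedian(values: number[], weights: number[]): number {
  const order = values
    .map((v, i) => ({ v, w: weights[i] }))
    .sort((a, b) => a.v - b.v);
  const half = weights.reduce((a, b) => a + b, 0) / 2;
  let cum = 0;
  for (const { v, w } of order) {
    cum += w;
    if (cum >= half) return v;
  }
  return order[order.length - 1].v;
}

// Three estimators predict [40, 50, 70] kWh; the middle one carries
// enough weight that the ensemble answers 50.
const pred = weightedMedian([40, 50, 70], [0.2, 0.5, 0.3]); // → 50
```

Unlike a weighted mean, the weighted median ignores outlier estimators entirely, which makes the ensemble robust to a few badly-fit rounds.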
score()
Return the R² score on the given test data.
reg.score(X: Matrix, y: Vector): number
Returns: R² score (coefficient of determination).
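score() reduces to the standard R² formula, 1 − SS_res / SS_tot, applied to predict()'s output; a self-contained sketch of that formula:

```typescript
// R² (coefficient of determination): 1 - SS_res / SS_tot.
// 1 means a perfect fit; 0 means no better than predicting the mean.
function r2Score(yTrue: number[], yPred: number[]): number {
  const mean = yTrue.reduce((a, b) => a + b, 0) / yTrue.length;
  const ssRes = yTrue.reduce((s, y, i) => s + (y - yPred[i]) ** 2, 0);
  const ssTot = yTrue.reduce((s, y) => s + (y - mean) ** 2, 0);
  return 1 - ssRes / ssTot;
}

const perfect = r2Score([1, 2, 3], [1, 2, 3]); // → 1
const baseline = r2Score([1, 2, 3], [2, 2, 2]); // mean predictor → 0
```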
getParams() / setParams()
Get or set hyperparameters.
reg.getParams(): AdaBoostRegressorOptions
reg.setParams(params: Partial<AdaBoostRegressorOptions>): AdaBoostRegressor
Properties
Collection of fitted weak learners.
estimatorWeights_
Weights for each estimator based on prediction error.
featureImportances_
Aggregated feature importances, weighted by estimator performance.
Example
import { AdaBoostRegressor } from "bun-scikit";
// Use default estimator (depth-3 trees)
const reg = new AdaBoostRegressor(null, {
nEstimators: 50,
learningRate: 1.0,
randomState: 42
});
// Energy consumption prediction
const X = [
[20, 50, 1000], // temp, humidity, sqft
[25, 45, 1200],
[18, 60, 900],
[30, 40, 1500],
[22, 55, 1100]
];
const y = [45, 52, 38, 68, 48]; // kWh
// Train
reg.fit(X, y);
// Predict
const testX = [
[23, 48, 1050],
[28, 42, 1400]
];
const predictions = reg.predict(testX);
console.log("Predicted energy consumption:");
predictions.forEach((kwh, i) => {
console.log(` Sample ${i + 1}: ${kwh.toFixed(2)} kWh`);
});
// R² score
const r2 = reg.score(X, y);
console.log(`R² score: ${r2.toFixed(4)}`);
// Feature importances
const features = ["temperature", "humidity", "sqft"];
console.log("\nFeature importances:");
reg.featureImportances_?.forEach((imp, i) => {
console.log(` ${features[i]}: ${imp.toFixed(4)}`);
});
// Update parameters
reg.setParams({ learningRate: 0.8, nEstimators: 100 });
Example: Custom Estimator
import { AdaBoostRegressor } from "bun-scikit";
import { DecisionTreeRegressor } from "bun-scikit";
// Use shallow trees for faster training
const reg = new AdaBoostRegressor(
() => new DecisionTreeRegressor({
maxDepth: 2,
minSamplesLeaf: 5
}),
{
nEstimators: 100,
learningRate: 0.8
}
);
// Reusing X, y, and testX from the example above
reg.fit(X, y);
const predictions = reg.predict(testX);