Overview
The Control module provides advanced algorithms including PID controllers, fuzzy logic systems, and machine learning classifiers for embedded automation and robotics.
PID Controllers
PID (Basic)
Basic PID controller implementation with output limiting.
#define ENABLE_MODULE_PID
#include "Kinematrix.h"
PID pid;
pid.setConstants(2.0, 1.0, 0.5, 0.1); // Kp, Ki, Kd, Td
pid.setOutputRange(0, 255);
pid.calculate(setpoint, actualValue);
float output = pid.getOutput();
calculate(setpoint, actual): calculate the PID output; actual is the process variable
getOutput(): returns the calculated PID control output
Get current error (setpoint - actual)
Reset PID state (clears the integral term)
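The steps behind calculate() and getOutput() follow the standard discrete PID form. Below is a minimal, self-contained sketch of that computation with output clamping; SimplePID and its member names are illustrative, not the library's internals:

```cpp
#include <algorithm>

// Minimal discrete PID step: output = Kp*e + Ki*integral(e) + Kd*de/dt,
// clamped to the configured output range (what setOutputRange() enforces).
struct SimplePID {
    float kp, ki, kd;
    float dt;                 // sample time in seconds
    float integral = 0.0f;
    float prevError = 0.0f;
    float outMin = 0.0f, outMax = 255.0f;

    float step(float setpoint, float actual) {
        float error = setpoint - actual;
        integral += error * dt;                       // accumulate the I term
        float derivative = (error - prevError) / dt;  // finite-difference D term
        prevError = error;
        float out = kp * error + ki * integral + kd * derivative;
        return std::min(outMax, std::max(outMin, out));
    }
};
```

With Kp = 2 and an error of 10, a single step yields 20; large errors saturate at the 0-255 limits rather than overflowing the actuator range.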
PIDv3 (Advanced)
Advanced PID with auto-tuning and additional features.
#define ENABLE_MODULE_PID
#include "Kinematrix.h"
PIDv3 pid;
PIDv3 pid;
pid.setTunings(2.0, 1.0, 0.5); // Kp, Ki, Kd
pid.setOutputLimits(0, 255);
pid.setMode(AUTOMATIC);
if (pid.compute(setpoint, input)) {
    float output = pid.getOutput();
    // Apply output
}
setTunings(Kp, Ki, Kd): set the PID tuning parameters; the proportional term can act on error or on measurement (P_ON_E or P_ON_M)
setMode(mode): set the controller mode, AUTOMATIC (1) or MANUAL (0)
compute(setpoint, input): compute the PID output; returns true if a new output was computed
Set controller direction: DIRECT (0) or REVERSE (1)
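compute() returning a bool typically reflects sample-time gating: the controller produces a fresh output only once per sample period, and the caller keeps applying the previous output otherwise. A sketch of that pattern (RateLimitedPID and its P-only update are illustrative, not the library's implementation):

```cpp
#include <cstdint>

// Sketch of why compute() returns a bool: between sample periods it returns
// false and the caller reuses the last output.
struct RateLimitedPID {
    uint32_t sampleTimeMs = 100;
    uint32_t lastTimeMs = 0;
    float lastOutput = 0.0f;

    bool compute(uint32_t nowMs, float setpoint, float input) {
        if (nowMs - lastTimeMs < sampleTimeMs)
            return false;                 // not time yet: keep the old output
        lastTimeMs = nowMs;
        float error = setpoint - input;
        lastOutput = 2.0f * error;        // placeholder P-only update
        return true;
    }
};
```

In an Arduino loop() this keeps the control math at a fixed rate even though loop() itself runs as fast as possible.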
Fuzzy Logic Systems
FuzzyMamdani
Mamdani fuzzy logic controller with multiple defuzzification methods.
#define ENABLE_MODULE_FUZZY_MAMDANI
#include "Kinematrix.h"
FuzzyMamdani fuzzy(2, 1, 10, 3); // 2 inputs, 1 output, 10 rules, 3 sets/var

// Add input variable
fuzzy.addInputVariable("temperature", 0, 40);
float tempLow[] = {0, 0, 20}; // Triangular membership
fuzzy.addFuzzySet(0, true, "low", TRIANGULAR, tempLow);

// Add output variable
fuzzy.addOutputVariable("fanSpeed", 0, 100);
float speedSlow[] = {0, 0, 50};
fuzzy.addFuzzySet(0, false, "slow", TRIANGULAR, speedSlow);

// Add rule: IF temp is low THEN speed is slow
int antecedentVars[] = {0};
int antecedentSets[] = {0};
fuzzy.addRule(antecedentVars, antecedentSets, 1, 0, 0, true);

float inputs[] = {25.0};
float *outputs = fuzzy.evaluate(inputs);
FuzzyMamdani(...): create a Mamdani fuzzy system with the maximum number of input variables, output variables, rules, and fuzzy sets per variable
addFuzzySet(varIndex, isInput, name, type, params): add a fuzzy set to a variable; isInput is true for input, false for output; type is TRIANGULAR, TRAPEZOIDAL, GAUSSIAN, or SINGLETON; params is the parameter array (3 values for triangular, 4 for trapezoidal, etc.)
addRule(...): add a fuzzy rule; takes arrays of antecedent variable indices and antecedent set indices, the consequent variable index, and true for the AND operator, false for OR
evaluate(inputs): evaluate the fuzzy system; returns an array of output values
Set defuzzification method: CENTROID, BISECTOR, MOM, SOM, or LOM
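The three-element parameter arrays above describe triangular sets {a, b, c}: membership rises from a to a peak at b and falls back to zero at c, so {0, 0, 20} is a left shoulder with full membership at 0. A sketch of that membership function and of discretized centroid defuzzification (illustrative helpers, not the library API):

```cpp
// Triangular membership over params {a, b, c}; degenerate edges (a == b or
// b == c) act as shoulders with full membership at the peak.
float triangular(float x, float a, float b, float c) {
    if (x <= a || x >= c) return (x == b) ? 1.0f : 0.0f;
    if (x < b) return (x - a) / (b - a);
    return (c - x) / (c - b);
}

// Discretized centroid: weighted average of sample points x[i] by their
// aggregated membership degrees.
float centroid(const float *membership, const float *x, int n) {
    float num = 0.0f, den = 0.0f;
    for (int i = 0; i < n; ++i) {
        num += membership[i] * x[i];
        den += membership[i];
    }
    return den > 0.0f ? num / den : 0.0f;
}
```

The other defuzzification methods (BISECTOR, MOM, SOM, LOM) differ only in how the aggregated membership curve is collapsed to a single crisp value.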
Machine Learning
KNN (K-Nearest Neighbors)
K-Nearest Neighbors classifier with multiple distance metrics.
#define ENABLE_MODULE_KNN
#include "Kinematrix.h"
KNN knn(3, 4, 100); // k=3, 4 features, 100 max samples

// Add training data
float sample1[] = {5.1, 3.5, 1.4, 0.2};
knn.addTrainingData("setosa", sample1);
float sample2[] = {6.3, 2.9, 5.6, 1.8};
knn.addTrainingData("virginica", sample2);

// Predict
float test[] = {5.0, 3.0, 1.5, 0.3};
const char *prediction = knn.predict(test);
Serial.println(prediction); // "setosa"
KNN(k, features, maxSamples): create a KNN classifier; features is the number of features per sample
predict(sample): predict the class for a feature array; returns the predicted class label
Set distance calculation method: EUCLIDEAN, MANHATTAN, or COSINE
Enable/disable feature normalization: true to enable normalization
Perform k-fold cross-validation: returns accuracy (0.0-1.0)
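At its core the classifier is a distance search over the stored training samples. A minimal k = 1 sketch using squared Euclidean distance follows; TinyKNN is a hypothetical name for illustration, not the library class:

```cpp
#include <cstring>

// Nearest-neighbor lookup (k = 1) over fixed-size stored samples.
struct TinyKNN {
    static constexpr int MAX = 16, FEATURES = 4;
    float samples[MAX][FEATURES];
    const char *labels[MAX];
    int count = 0;

    void add(const char *label, const float *f) {
        memcpy(samples[count], f, sizeof(float) * FEATURES);
        labels[count++] = label;
    }

    const char *predict(const float *f) const {
        int best = 0;
        float bestDist = 1e30f;
        for (int i = 0; i < count; ++i) {
            float d = 0.0f;
            for (int j = 0; j < FEATURES; ++j) {
                float diff = samples[i][j] - f[j];
                d += diff * diff;       // squared Euclidean distance
            }
            if (d < bestDist) { bestDist = d; best = i; }
        }
        return labels[best];
    }
};
```

With k > 1 the library instead takes a majority vote over the k closest samples; normalization matters because unscaled features with large ranges dominate the distance.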
DecisionTree
Decision tree classifier supporting mixed input types and multiple split criteria.
#define ENABLE_MODULE_DECISION_TREE
#include "Kinematrix.h"
DecisionTree tree(4, 100, 10); // 4 features, 100 samples, depth 10

// Add training samples
FeatureValue features1[] = {5.1, 3.5, 1.4, 0.2};
tree.addTrainingSample(features1, "setosa");

// Train the tree
tree.train(ENTROPY, COST_COMPLEXITY);

// Predict
FeatureValue test[] = {5.0, 3.0, 1.5, 0.3};
const char *prediction = tree.predictClass(test);
DecisionTree(...): create a decision tree; parameters include the minimum number of samples required to split a node
addTrainingSample(features, label): add a training sample (classification)
train(criterion, pruning): train the decision tree
  criterion (SplitCriterion, default MIXED_CRITERION): GINI, ENTROPY, MSE, MAE, or MIXED_CRITERION
  pruning (PruningMethod, default COST_COMPLEXITY): NO_PRUNING, COST_COMPLEXITY, or REDUCED_ERROR
predictClass(sample): predict the class for a sample; returns the predicted class label
Get feature importance scores: returns an array of importance values (0.0-1.0)
Print the tree structure to Serial
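The ENTROPY criterion scores candidate splits by class impurity: a split is good when its children have lower entropy than the parent. A sketch of the underlying computation (entropyBits is a hypothetical helper, not part of the library):

```cpp
#include <cmath>

// Shannon entropy of a class-count distribution, in bits: a pure node
// (one class) scores 0, a 50/50 two-class node scores 1.
float entropyBits(const int *counts, int numClasses) {
    int total = 0;
    for (int i = 0; i < numClasses; ++i) total += counts[i];
    float h = 0.0f;
    for (int i = 0; i < numClasses; ++i) {
        if (counts[i] == 0) continue;
        float p = (float)counts[i] / total;
        h -= p * log2f(p);      // -sum(p * log2(p))
    }
    return h;
}
```

GINI replaces -sum(p * log2(p)) with 1 - sum(p^2), which avoids the logarithm and is cheaper on AVR-class MCUs, while MSE/MAE are the corresponding impurity measures for regression targets.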
Algorithm       Execution Time   Memory    Use Case
PID             ~100μs           Minimal   Temperature, motor control
Fuzzy Logic     ~500μs - 2ms     Medium    HVAC, irrigation
KNN             Variable         High      Pattern recognition
Decision Tree   ~200μs           Medium    Classification
ESP32: full support, including complex ML
ESP8266: all algorithms, with optimization
AVR: PID and basic fuzzy logic