TreeBagger parameter tuning for classification

kawther hassine - 2022-07-14
Question: TreeBagger parameter tuning for classification

How can I tune the parameters of a TreeBagger model for classification? I followed the example "Tune Random Forest Using Quantile Error and Bayesian Optimization" (https://fr.mathworks.com/help/stats/tune-random-forest-using-quantile-error-and-bayesian-optimization.html), changing only 'regression' to 'classification'. The following code generates multiple errors:

results = bayesopt(@(params)oobErrRF(params,X),hyperparametersRF,...
    'AcquisitionFunctionName','expected-improvement-plus','Verbose',0);

Errors:

Error using classreg.learning.internal.table2FitMatrix>resolveName (line 232)
One or more 'ResponseName' parameter values are invalid.
Error in classreg.learning.internal.table2FitMatrix (line 77)
ResponseName = resolveName('ResponseName',ResponseName,FormulaResponseName,false,VarNames);
Error in ClassificationTree.prepareData (line 557)
[X,Y,vrange,wastable,varargin] = classreg.learning.internal.table2FitMatrix(X,Y,varargin{:},'OrdinalIsCategorical',false);
Error in TreeBagger/init (line 1335)
ClassificationTree.prepareData(x,y,...
Error in TreeBagger (line 615)
bagger = init(bagger,X,Y,makeArgs{:});
Error in oobErrRF2 (line 16)
randomForest = TreeBagger(300,X,'MPG','Method','classification',...
Error in @(params)oobErrRF2(params,trainingDataFeatures)
Error in BayesianOptimization/callObjNormally (line 2184)
Objective = this.ObjectiveFcn(conditionalizeX(this, X));
Error in BayesianOptimization/callObjFcn (line 2145)
= callObjNormally(this, X);
Error in BayesianOptimization/callObjFcn (line 2162)
= callObjFcn(this, X);
Error in BayesianOptimization/performFcnEval (line 2128)
ObjectiveFcnObjectiveEvaluationTime, this] = callObjFcn(this, this.XNext);
Error in BayesianOptimization/run (line 1836)
this = performFcnEval(this);
Error in BayesianOptimization (line 450)
this = run(this);
Error in bayesopt (line 287)
Results = BayesianOptimization(Options);

I would like to know whether there is a way to use this tuning method for classification. If not, how can I tune the parameters of a TreeBagger classifier?

Expert Answer

Prashant Kumar answered 2025-11-20

The following works for me in R2018a. It predicts Cylinders (3 classes) and calls oobError to get the out-of-bag misclassification rate of the ensemble. As a side note, the ResponseName error in your trace suggests that the response name you passed to TreeBagger ('MPG') does not match any variable in the table you supplied (trainingDataFeatures); the response name must be one of the table's variable names.


load carsmall
Cylinders = categorical(Cylinders);
Mfg = categorical(cellstr(Mfg));
Model_Year = categorical(Model_Year);
X = table(Acceleration,Cylinders,Displacement,Horsepower,Mfg,...
    Model_Year,Weight,MPG);
rng('default'); % For reproducibility

maxMinLS = 20;
minLS = optimizableVariable('minLS',[1,maxMinLS],'Type','integer');
numPTS = optimizableVariable('numPTS',[1,size(X,2)-1],'Type','integer');
hyperparametersRF = [minLS; numPTS];

results = bayesopt(@(params)oobErrRF(params,X),hyperparametersRF,...
    'AcquisitionFunctionName','expected-improvement-plus','Verbose',1);
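
Each objective evaluation trains a 300-tree forest, so the search can take a while. If you have Parallel Computing Toolbox, bayesopt supports parallel objective evaluation; a minimal variant of the call above, assuming a parallel pool is available:

results = bayesopt(@(params)oobErrRF(params,X),hyperparametersRF,...
    'AcquisitionFunctionName','expected-improvement-plus',...
    'UseParallel',true,'Verbose',1); % evaluate the objective in parallel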

bestOOBErr = results.MinObjective
bestHyperparameters = results.XAtMinObjective
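
results.XAtMinObjective is the best point actually observed. BayesianOptimization objects also provide bestPoint, which lets you pick the optimum by a model-based criterion instead; a short sketch (the criterion shown is one of several documented options):

[xBest,critValue] = bestPoint(results,'Criterion','min-visited-mean'); % best visited point by posterior mean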

Mdl = TreeBagger(300,X,'Cylinders','Method','classification',...
    'MinLeafSize',bestHyperparameters.minLS,...
    'NumPredictorsToSample',bestHyperparameters.numPTS);
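
To sanity-check the tuned settings, you can retrain with out-of-bag prediction enabled and compare the ensemble's error with bestOOBErr; a sketch, where MdlOOB and tunedErr are illustrative names:

MdlOOB = TreeBagger(300,X,'Cylinders','Method','classification',...
    'OOBPrediction','on','MinLeafSize',bestHyperparameters.minLS,...
    'NumPredictorsToSample',bestHyperparameters.numPTS);
tunedErr = oobError(MdlOOB,'Mode','ensemble') % should be close to bestOOBErr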

function oobErr = oobErrRF(params,X)
%oobErrRF Trains random forest and estimates out-of-bag classification error
%   oobErr trains a random forest of 300 classification trees using the
%   predictor data in X and the parameter specification in params, and then
%   returns the out-of-bag misclassification rate of the ensemble. X is a
%   table and params is an array of OptimizableVariable objects
%   corresponding to the minimum leaf size and the number of predictors to
%   sample at each node.
randomForest = TreeBagger(300,X,'Cylinders','Method','classification',...
    'OOBPrediction','on','MinLeafSize',params.minLS,...
    'NumPredictorsToSample',params.numPTS);
oobErr = oobError(randomForest, 'Mode','ensemble');
end
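
If you run all of this as one script, note that oobErrRF must sit at the end of the script file (local functions in scripts require R2016b or later); alternatively, save it as oobErrRF.m on your MATLAB path.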


