SVM training with hyperparameter "CVPartition"

5 views (last 30 days)
hobin Hwang
hobin Hwang on 16 Jun 2022
Commented: hobin Hwang on 19 Jun 2022
I recently came across a variety of hyperparameters while training an SVM. Among them, I am curious about the relationship between the 'CVPartition' and 'OptimizeHyperparameters' arguments.
  1. Is CVPartition applied only when 'OptimizeHyperparameters' is set to 'none'?
  2. When 'OptimizeHyperparameters' is set to 'auto', the code still runs. Does cross-validation not take place in that case?
  3. Even when I set up and run 10-fold cross-validation, the output is only one model, a ClassificationSVM object. Is this single model the one trained with the parameters that performed best across the ten folds?
Here is my code; I'd appreciate it if you could tell me what the problem is.
classificationSVM = fitcsvm( ...
    Md_train_data, ...                      % training set
    Label, ...                              % training set labels
    'KernelFunction', 'gaussian', ...
    'PolynomialOrder', [], ...
    'KernelScale', 'Auto', ...
    'BoxConstraint', 1, ...
    'Standardize', true, ...
    'ClassNames', [1; 2], ...
    'OptimizeHyperparameters', 'none', ...  % <-- this is the first point in question
    'HyperparameterOptimizationOptions', struct( ...
        'Optimizer', 'gridsearch', 'NumGridDivisions', 10, ...
        'AcquisitionFunctionName', 'expected-improvement-plus', 'ShowPlots', false, ...
        'CVPartition', cvpartition(Label, 'kfold', 10)));  % <-- and this is the second

Accepted Answer

Alan Weiss
Alan Weiss on 16 Jun 2022
Edited: Alan Weiss on 16 Jun 2022
I think that you have a misunderstanding about what these options do. Look in the first paragraph of the documentation of HyperparameterOptimizationOptions for fitcsvm: "This argument modifies the effect of the OptimizeHyperparameters name-value argument." This means, among other things, that if you set 'OptimizeHyperparameters','none' as in your code, then it does not matter what you set for HyperparameterOptimizationOptions, because there will be no optimization and your CVPartition will never be used.
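For example, here is a minimal sketch (reusing the variable names from your code; Mdl is just an output name) of how to make the partition actually matter: turn the optimization on and pass the cvpartition inside HyperparameterOptimizationOptions, where it controls the cross-validation used by the search.
c = cvpartition(Label, 'KFold', 10);           % 10-fold partition for the search
Mdl = fitcsvm(Md_train_data, Label, ...
    'KernelFunction', 'gaussian', ...
    'Standardize', true, ...
    'ClassNames', [1; 2], ...
    'OptimizeHyperparameters', 'auto', ...     % tunes BoxConstraint and KernelScale
    'HyperparameterOptimizationOptions', struct( ...
        'Optimizer', 'gridsearch', ...
        'NumGridDivisions', 10, ...
        'CVPartition', c, ...                  % this partition is used for the optimization loss
        'ShowPlots', false));
Note that the fixed 'BoxConstraint' and 'KernelScale' values are omitted here, since those are the parameters being optimized.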
I am not sure that I understand your second question. If you set 'OptimizeHyperparameters','auto', then the optimized hyperparameters are {'BoxConstraint','KernelScale'}, as documented. As the documentation states, the thing being optimized is the cross-validation loss: "The optimization attempts to minimize the cross-validation loss (error) for fitcsvm by varying the parameters." So there is indeed cross-validation in this case.
For your third question, I think you have a misunderstanding about what is reported back. After the optimization is completed and the optimal hyperparameters are found, fitcsvm uses those optimal hyperparameters to fit a classifier on the full training data. You do not get the result as a cross-validated (ClassificationPartitionedModel) object, but as a single classification object. OK?
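If what you want at the end is a cross-validated (partitioned) model of the final classifier, one option is to cross-validate it after the fit. A minimal sketch, assuming the tuned model from an optimized call like the sketch above is stored in Mdl:
cvMdl  = crossval(Mdl, 'KFold', 10);   % ClassificationPartitionedModel with 10 fold models
cvLoss = kfoldLoss(cvMdl);             % estimated 10-fold classification error
The details of the search itself, including the best point found, are recorded in the model's HyperparameterOptimizationResults property.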
Alan Weiss
MATLAB mathematical toolbox documentation
  3 Comments
Walter Roberson
Walter Roberson on 17 Jun 2022
Well, you could do that, but you would have to do it "by hand" by calling the function a number of times with different parameters. Setting the optimization option to 'none' means that cross-validation will not be done by the function.
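For instance, a minimal sketch of that "by hand" loop, reusing the variable names from the question (the grid values below are placeholders, not recommendations):
c = cvpartition(Label, 'KFold', 10);       % one fixed partition for a fair comparison
boxVals   = [0.1 1 10];                    % candidate BoxConstraint values
scaleVals = [0.1 1 10];                    % candidate KernelScale values
bestLoss  = Inf;
for b = boxVals
    for s = scaleVals
        % With 'CVPartition' given directly as a name-value pair, fitcsvm
        % returns a ClassificationPartitionedModel (ten fold models).
        cvMdl = fitcsvm(Md_train_data, Label, ...
            'KernelFunction', 'gaussian', 'Standardize', true, ...
            'BoxConstraint', b, 'KernelScale', s, ...
            'CVPartition', c);
        L = kfoldLoss(cvMdl);              % 10-fold classification error
        if L < bestLoss
            bestLoss = L; bestB = b; bestS = s;
        end
    end
end
% Refit one final model on all the training data with the best pair found.
finalMdl = fitcsvm(Md_train_data, Label, ...
    'KernelFunction', 'gaussian', 'Standardize', true, ...
    'BoxConstraint', bestB, 'KernelScale', bestS);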
hobin Hwang
hobin Hwang on 19 Jun 2022
Thanks for your kind reply. Your advice has been helpful.


More Answers (0)
