SVM Cross Validation Training
I am using K-fold cross-validation with K = 10.
I am supposed to run 10-fold cross-validation and take the average of the SVM performance.
How should I do this? Does running the cross-validation once generate predictions for only one fold, or for the complete 10 folds?
1 Comment
Mohammad Sami
8 May 2020
According to the documentation, the reported performance is the average over all folds.
https://www.mathworks.com/help/releases/R2020a/stats/select-data-and-validation-for-classification-problem.html
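As a quick check of this, the short sketch below (using the same ionosphere example data as the answer further down; variable names are illustrative) asks "kfoldLoss" for the per-fold losses and takes their mean, which matches the default averaged loss.
load ionosphere
mdl = fitcsvm(X,Y,'Standardize',true,'KernelFunction','RBF','KernelScale','auto');
cvmdl = crossval(mdl);                              % 10-fold partition by default
perFoldLoss = kfoldLoss(cvmdl,'Mode','individual'); % 10-by-1 vector, one loss per fold
avgLoss = mean(perFoldLoss)                         % same value as kfoldLoss(cvmdl)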
Answers (1)
Gayathri
3 Jan 2025
I understand that you need to perform K-fold cross-validation for an SVM model. For this, you can use the "crossval" function, and then the "kfoldLoss" function to get the classification loss of the cross-validated model. Please refer to the code below, which implements this.
load ionosphere
% Train an SVM classifier using the radial basis function (RBF) kernel
SVMModel = fitcsvm(X,Y,'Standardize',true,'KernelFunction','RBF','KernelScale','auto');
% Cross-validate the SVM classifier (10-fold by default)
CVSVMModel = crossval(SVMModel);
% Estimate the out-of-sample misclassification rate, averaged over the folds
classLoss = kfoldLoss(CVSVMModel)
"crossval" by default uses 10-fold cross-validation.
Please refer to the "Train and Cross-Validate SVM Classifier" example in the documentation.
Hope you find this information helpful!