loss returns very low values in feature forward selection

Esmeralda Ruiz Pujadas on 21 January 2022
Dear all,
I am wondering why loss returns very low values, quite different from the classification error, during sequential forward feature selection. For example:
classifierfun = @(train_data,train_labels,test_data,test_labels) ...
    loss(fitcsvm(train_data,train_labels,'KernelFunction','gaussian', ...
    'KernelScale','auto','Standardize',true), ...
    test_data,test_labels,'LossFun','classiferror');
[fs,history] = sequentialfs(classifierfun,table2array(TableFeaturesNormalized),Y, ...
    'cv',c,'nfeatures',min(size(TableFeaturesNormalized,2),max_its_fs),'options',opts)
I get
Step 1, added column 178, criterion value 0.00996737
Step 2, added column 245, criterion value 0.00997051
The same happens here:
opts = statset('display','iter');
costfun = @(XT,yT,Xt,yt) loss(fitcecoc(XT,yT),Xt,yt);
[fs,history] = sequentialfs(costfun,X_train,y_train,'cv',cv,'options',opts);
Why is this criterion value so low if it is supposed to be a classification error?
However, if I do
classifierfun = @(train_data,train_labels,test_data,test_labels) ...
sum(predict(fitcsvm(train_data,train_labels,'KernelFunction', 'gaussian','Standardize',true),
test_data) ~= test_labels);
the values make sense:
Step 1, added column 178, criterion value 0.36233363
Step 2, added column 245, criterion value 0.35302325
Thank you for the help

Accepted Answer

Kumar Pallav on 1 February 2022
Hi,
As per my understanding, sequentialfs sums the values returned by classifierfun across all cross-validation test sets and then divides that sum by the total number of test observations, so it expects the criterion function to return a misclassification count. Because loss with 'LossFun','classiferror' already returns a fraction (the per-fold error rate), sequentialfs ends up dividing an already-normalized quantity again, and the reported criterion comes out roughly the error rate divided by the number of test observations. That is why the values are so low, while your sum(predict(...) ~= test_labels) version, which returns a count, gives values on the expected scale. You may refer to the sequentialfs documentation for details.
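For example, a minimal sketch of a criterion function that returns a count instead of a rate (reusing the variable names TableFeaturesNormalized, Y, c, opts, and max_its_fs from your question) could look like this:

classifierfun = @(train_data,train_labels,test_data,test_labels) ...
    size(test_data,1) * ...                          % turn the fractional error back into a count
    loss(fitcsvm(train_data,train_labels,'KernelFunction','gaussian', ...
                 'KernelScale','auto','Standardize',true), ...
         test_data,test_labels,'LossFun','classiferror');   % loss returns a rate in [0,1]

[fs,history] = sequentialfs(classifierfun, ...
    table2array(TableFeaturesNormalized),Y, ...
    'cv',c, ...
    'nfeatures',min(size(TableFeaturesNormalized,2),max_its_fs), ...
    'options',opts);

With this change the reported criterion values should land on the same scale as the sum(predict(...) ~= test_labels) version in your question.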
Hope it helps!
1 Comment
Esmeralda Ruiz Pujadas on 4 February 2022
Thank you very much. OK, now I understand why it gives such low values compared with the SVM error. Thanks!


