SVM and KNN hyperparameter
I am attempting to tune the hyperparameters of KNN and SVM classifiers with an optimization algorithm other than Bayesian optimization.
Can anyone help, please?
0 Comments
Answers (1)
Alan Weiss
1 Jul 2022
Is this what you are looking for?
Alan Weiss
MATLAB mathematical toolbox documentation
2 Comments
Alan Weiss
4 Jul 2022
To use a different algorithm, you would have to run a different solver. I am not at all sure what benefit a different algorithm would bring, and I am not familiar with the Bat algorithm. Really, what do you expect to get that is better?
To use ga (the genetic algorithm), you need a Global Optimization Toolbox license. To minimize the cross-validation error, you might want to fix the partition and then use ga to minimize the error as a function of the parameters. Something like this:
rng default
n = numel(Y); % Number of observations; X is the predictor matrix, Y the class labels
c = cvpartition(n,'KFold',5); % Fix a cross-validation partition
% Optimize over x(1) = BoxConstraint and x(2) = KernelScale
lb = [1/10,1/10]; % Somewhat arbitrary bounds
ub = [10,10];
% Minimize the cross-validation loss
fun = @(x)kfoldLoss(fitcsvm(X,Y,'CVPartition',c,'BoxConstraint',x(1),'KernelScale',x(2)));
[sol,fval] = ga(fun,2,[],[],[],[],lb,ub);
% Now train the model on the optimal parameters
model = fitcsvm(X,Y,'BoxConstraint',sol(1),'KernelScale',sol(2));
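The same pattern would apply to the KNN part of your question. A minimal sketch, assuming the same X and Y and tuning only the number of neighbors (you could add further parameters as extra variables):
% Sketch only: tune NumNeighbors for a KNN classifier with ga
rng default
n = numel(Y);
c = cvpartition(n,'KFold',5); % Fix a partition, as above
fun = @(k)kfoldLoss(fitcknn(X,Y,'CVPartition',c,'NumNeighbors',k));
% The tenth input (IntCon = 1) tells ga to treat the single variable as integer-valued
[kBest,fvalKnn] = ga(fun,1,[],[],[],[],1,30,[],1);
% Train the final KNN model with the best number of neighbors
knnModel = fitcknn(X,Y,'NumNeighbors',kBest);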
But again, before you do this, I believe you should think about what you expect to get that is better than the automatic hyperparameter optimization using Bayesian optimization.
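For comparison, that automatic Bayesian optimization is built into the fit functions themselves; a minimal sketch, again assuming X and Y hold your data:
% Built-in Bayesian hyperparameter optimization for both classifiers
rng default
svmAuto = fitcsvm(X,Y,'OptimizeHyperparameters','auto');
knnAuto = fitcknn(X,Y,'OptimizeHyperparameters','auto');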
Alan Weiss
MATLAB mathematical toolbox documentation