These two pieces of code are connected to each other. When I run the first one, it gives me an error that says: "Error using KNN_: Too few input arguments." What should I do?
global A trn vald ;
SearchAgents_no=10; % Number of search agents
Max_iteration=100; % Maximum number of iterations
% A=load ('C:\Users\d\Downloads\archive (1).dat');
%digitDatasetPath = fullfile('C:\Users\d\Downloads\archive (1)');
load('f')                               % load the feature matrix (provides the variable featuresall)
A=featuresall;
nVar=size(featuresall,2)-1;             % number of features (the last column is assumed to hold the class labels)
r=randperm(size(featuresall,1));        % shuffle the sample indices
trn=r(1:floor(length(r)/2));            % first half of the shuffled indices -> training set
vald=r(floor(length(r)/2)+1:end);       % second half -> validation set
tic
[Best_score,Best_pos,Convergence_curve]=BGWOPSO(SearchAgents_no,(Max_iteration),0,1,size(A,2)-1,'AccSz');
time = toc;
acc = Acc(Best_pos);
fprintf('hybrid Acc %f\thybrid Fitness: %f\thybrid Solution: %s\thybrid Dimension: %d\thybrid Time: %f\n',acc,Best_score,num2str(Best_pos,'%1d'),sum(Best_pos(:)),time);
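% --- Illustrative sketch, not part of the original script ---
% One way the trn/vald split above could be handed to KNN_ (defined below)
% with all five inputs. The real call happens inside the fitness function
% used by BGWOPSO, which is not shown here, and the assumption that the
% last column of A holds the class labels is mine.
sel = find(Best_pos);                                     % feature columns kept by the optimizer
[~,~,acc_check] = KNN_(3, A(trn,sel), A(trn,end), A(vald,sel), A(vald,end));
fprintf('sanity-check KNN_ accuracy on the validation split: %f\n', acc_check);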
%second code
function [predicted_labels,nn_index,accuracy] = KNN_(k,data,labels,t_data,t_labels)
%KNN_: classification using the k-nearest neighbors algorithm. Neighbors
%are found by Euclidean distance.
%Usage:
% [~,~,accuracy] = KNN_(3,training,training_labels,testing,testing_labels);
% predicted_labels = KNN_(3,training,training_labels,testing)
%Input:
% - k: number of nearest neighbors
% - data: (NxD) training data; N is the number of samples and D is the
% dimensionality of each data point
% - labels: training labels
% - t_data: (MxD) testing data; M is the number of data points and D
% is the dimensionality of each data point
% - t_labels: testing labels (default = [])
%Output:
% - predicted_labels: the predicted labels based on the k-NN
% algorithm
% - nn_index: the index of the nearest training data point for each testing sample (Mx1).
% - accuracy: if the testing labels are supplied, the classification
% accuracy is returned; otherwise it is zero.
%Author: Mahmoud Afifi - York University
%checks
if nargin < 4
error('Too few input arguments.')
elseif nargin < 5
t_labels=[];
end
accuracy=0; %default; overwritten below when testing labels are provided
if size(data,2)~=size(t_data,2)
error('training and testing data must have the same dimensionality');
end
if mod(k,2)==0
error('to reduce the chance of ties, please choose odd k');
end
%initialization
predicted_labels=zeros(size(t_data,1),1);
ed=zeros(size(t_data,1),size(data,1)); %ed: (MxN) euclidean distances
ind=zeros(size(t_data,1),size(data,1)); %corresponding indices (MxN)
k_nn=zeros(size(t_data,1),k); %k-nearest neighbors for testing sample (Mxk)
%calc euclidean distances between each testing data point and the training
%data samples
for test_point=1:size(t_data,1)
for train_point=1:size(data,1)
%calc and store sorted euclidean distances with corresponding indices
ed(test_point,train_point)=sqrt(...
sum((t_data(test_point,:)-data(train_point,:)).^2));
end
[ed(test_point,:),ind(test_point,:)]=sort(ed(test_point,:));
end
%find the nearest k for each data point of the testing data
k_nn=ind(:,1:k);
nn_index=k_nn(:,1);
%get the majority vote
for i=1:size(k_nn,1)
options=unique(labels(k_nn(i,:)'));
max_count=0;
max_label=0;
for j=1:length(options)
L=length(find(labels(k_nn(i,:)')==options(j)));
if L>max_count
max_label=options(j);
max_count=L;
end
end
predicted_labels(i)=max_label;
end
%calculate the classification accuracy
if ~isempty(t_labels)
accuracy=length(find(predicted_labels==t_labels))/size(t_data,1);
end
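For reference, KNN_ requires at least four inputs (see the nargin check above), so running the function file on its own produces exactly the "Too few input arguments" error. Below is a minimal, self-contained sketch of a valid call using made-up data; all names and values are illustrative only.
rng(1);                                      % reproducible toy example
training    = rand(20,4);                    % 20 training samples, 4 features
train_lbls  = [ones(10,1); 2*ones(10,1)];    % two classes
testing     = rand(6,4);                     % 6 testing samples, same 4 features
test_lbls   = [ones(3,1); 2*ones(3,1)];
[pred,~,acc] = KNN_(3, training, train_lbls, testing, test_lbls);
fprintf('predicted labels: %s  accuracy: %f\n', mat2str(pred'), acc);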
2 Comments
Christopher McCausland
13 Mar 2023
Essrra,
Please edit this to ask a question. No one can help if they don't know what you want/need help with.
Christopher
Rik
14 Mar 2023
Edited: 14 Mar 2023
Also, did Mahmoud Afifi agree to have his code posted here?
If you have trouble with MATLAB basics, you may consider doing the Onramp tutorial (which is provided for free by MathWorks).
Answers (0)
This question is closed.