# How to apply majority voting for a classification ensemble in MATLAB?

phdcomputer Eng on 4 May 2019

I have five classifiers: SVM, random forest, naive Bayes, decision tree, and KNN (my MATLAB code is attached). I want to combine the results of these five classifiers on a dataset using the majority voting method, with all classifiers given the same weight. Since there are five classifiers, each test sample receives five predicted labels (the class labels in this example are 1 or 2). I'll be grateful for your opinions.
```matlab
clear all
close all
clc
% data = ...;   % load your dataset into the variable "data" here
[n,m] = size(data);
rows = 1:n;
test_count = floor((1/6)*n);   % hold out one sixth of the data for testing
test_rows = randsample(rows, test_count);
train_rows = setdiff(rows, test_rows);
test = data(test_rows,:);
train = data(train_rows,:);
xtest = test(:,1:m-1);
ytest = test(:,m);
xtrain = train(:,1:m-1);
ytrain = train(:,m);
%-----------SVM------------------
svm = svm1(xtest, xtrain, ytrain);              % svm1 is my own helper function
%-----------random forest--------
rforest = randomforest(xtest, xtrain, ytrain);  % my own helper function
%-----------decision tree--------
DT = DTree(xtest, xtrain, ytrain);              % my own helper function
%-----------naive Bayes----------
NBModel = NaiveBayes.fit(xtrain, ytrain, 'Distribution', 'kernel');  % fitcnb in newer releases
dt = NBModel.predict(xtest);
%-----------KNN------------------
knnModel = fitcknn(xtrain, ytrain, 'NumNeighbors', 4);
sk = knnModel.predict(xtest);
```
How can I apply majority voting directly to these classifier outputs in MATLAB?
Thanks very much.


### Accepted Answer

Ahmad Obeid on 24 May 2019

I don't think there's an existing function that does this for you, so you have to build your own. Here is a suggested method:
• Assuming you have your five prediction arrays from your five different classifiers,
• all prediction arrays have the same size = length(test_rows), and
• you have 2 classes, 1 and 2, you can do the following:
```matlab
% First we concatenate all prediction arrays into one big matrix.
% Make sure that all prediction arrays are of the same type (I am assuming
% double here) and that they are all column vectors.
Prediction = [svm, rforest, DT, dt, sk];   % DT holds the decision-tree output
Final_decision = zeros(length(test_rows), 1);
all_results = [1, 2];   % possible outcomes
for row = 1:length(test_rows)
    election_array = zeros(1, 2);   % vote count for each class
    for col = 1:5   % your five different classifiers
        election_array(Prediction(row, col)) = ...
            election_array(Prediction(row, col)) + 1;
    end
    [~, I] = max(election_array);   % the class with the most votes wins
    Final_decision(row) = all_results(I);
end
```
Hope this helps.
##### 7 Comments
Mona Al-Kharraz on 4 Apr 2020
Thanks Mr. Ahmad, it is working for me.
doaa khalil on 12 Aug 2020
Hi Mr. Ahmad, I want to apply majority voting for two classification models. Can you help me?
```matlab
unzip('Preprocessing.zip');
imds = imageDatastore('Preprocessing', ...
    'IncludeSubfolders', true, ...
    'LabelSource', 'foldernames');
% Use countEachLabel to summarize the number of images per category.
tbl1 = countEachLabel(imds)
% Divide the data into training and validation data sets
rng('default')   % for reproducibility
[trainingSet, testSet] = splitEachLabel(imds, 0.3, 'randomize');
net1 = alexnet;
% Inspect the first layer
net1.Layers(1)
% Create augmentedImageDatastores from the training and test sets to resize
% the images in imds to the size required by network 1.
imageSize1 = net1.Layers(1).InputSize;
augmentedTrainingSet1 = augmentedImageDatastore(imageSize1, trainingSet, 'ColorPreprocessing', 'gray2rgb');
augmentedTestSet1 = augmentedImageDatastore(imageSize1, testSet, 'ColorPreprocessing', 'gray2rgb');
featureLayer1 = 'fc7';
trainingFeatures1 = activations(net1, augmentedTrainingSet1, featureLayer1, ...
    'MiniBatchSize', 32, 'OutputAs', 'columns');
net2 = resnet50;
% Visualize the first section of the network.
figure
plot(net2)
title('First section of ResNet-50')
set(gca, 'YLim', [150 170]);
% Inspect the first layer
net2.Layers(1)
% Create augmentedImageDatastores from the training and test sets to resize
% the images in imds to the size required by network 2.
imageSize2 = net2.Layers(1).InputSize;
augmentedTrainingSet2 = augmentedImageDatastore(imageSize2, trainingSet, 'ColorPreprocessing', 'gray2rgb');
augmentedTestSet2 = augmentedImageDatastore(imageSize2, testSet, 'ColorPreprocessing', 'gray2rgb');
featureLayer2 = 'fc1000';
trainingFeatures2 = activations(net2, augmentedTrainingSet2, featureLayer2, ...
    'MiniBatchSize', 32, 'OutputAs', 'columns');
%% Train a multiclass SVM classifier using CNN1 features
% Get the training labels from the trainingSet
trainingLabels = trainingSet.Labels;
% Train a multiclass SVM classifier using a fast linear solver, and set
% 'ObservationsIn' to 'columns' to match the arrangement used for the
% training features.
classifier1 = fitcecoc(trainingFeatures1, trainingLabels, ...   % was "newfeatures" (undefined)
    'Learners', 'Linear', 'Coding', 'onevsall', 'ObservationsIn', 'columns');
% Extract test features using CNN1
testFeatures1 = activations(net1, augmentedTestSet1, featureLayer1, ...
    'MiniBatchSize', 32, 'OutputAs', 'columns');
% Pass the CNN image features to the trained classifier
predictedLabels1 = predict(classifier1, testFeatures1, 'ObservationsIn', 'columns');
```
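A possible sketch of the missing second half, reusing the variables already defined above; `classifier2`, `testFeatures2`, `predictedLabels2`, and `finalLabels` are names introduced here, not in the original. Note that with only two voters a majority vote cannot break disagreements, so this sketch keeps model 1's label whenever the two models disagree:

```matlab
% Train the second SVM on the ResNet-50 features
classifier2 = fitcecoc(trainingFeatures2, trainingLabels, ...
    'Learners', 'Linear', 'Coding', 'onevsall', 'ObservationsIn', 'columns');
% Extract test features using CNN2 and predict
testFeatures2 = activations(net2, augmentedTestSet2, featureLayer2, ...
    'MiniBatchSize', 32, 'OutputAs', 'columns');
predictedLabels2 = predict(classifier2, testFeatures2, 'ObservationsIn', 'columns');
% Two-model "vote": agreeing rows are settled; on disagreement there is no
% majority, so model 1's prediction is used as the tie-break
finalLabels = predictedLabels1;
disagree = predictedLabels1 ~= predictedLabels2;
% (optionally inspect the disagreeing samples, e.g. testSet.Labels(disagree))
```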


### Other Answers (3)

Li Ai on 30 Mar 2021
I think you can just put the outputs of the five models together as a matrix, then use the mode function.
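With the variable names from the question above, that would look like this (a sketch; it assumes the five prediction arrays are numeric column vectors of equal length):

```matlab
% One column per classifier; mode along dimension 2 takes the row-wise
% majority, i.e. the most frequent label among the five votes
Prediction = [svm, rforest, DT, dt, sk];
Final_decision = mode(Prediction, 2);
```

With five voters and two classes there can be no ties; in general, mode breaks ties by returning the smallest of the tied values.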
##### 0 Comments


Sinan Islam on 12 Dec 2020
MATLAB should consider adding a built-in voting ensemble.
##### 0 Comments


Abida Ashraf on 16 Oct 2019
How can three classifiers such as fitcnb, fitcecoc, and fitensemble be used to get averaged results?
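One possible approach is soft voting: average the per-class scores returned by predict and pick the class with the highest mean. A sketch, assuming all three models are trained on the same data so that their score columns follow the same class order; `X`, `Y`, and `Xtest` are placeholder names:

```matlab
mdl1 = fitcnb(X, Y);
mdl2 = fitcecoc(X, Y, 'FitPosterior', true);   % posteriors require 'FitPosterior'
mdl3 = fitcensemble(X, Y);                     % successor of fitensemble
[~, s1] = predict(mdl1, Xtest);        % naive Bayes: 2nd output is the posterior
[~, ~, ~, s2] = predict(mdl2, Xtest);  % ECOC: the posterior is the 4th output
[~, s3] = predict(mdl3, Xtest);        % ensemble: scores, not always probabilities
avgScore = (s1 + s2 + s3) / 3;         % equal-weight average (soft vote)
[~, idx] = max(avgScore, [], 2);
finalLabel = mdl1.ClassNames(idx);     % ClassNames order matches the score columns
```

If the ensemble's scores are not on a [0,1] probability scale (e.g. AdaBoost margins), normalize them before averaging.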

