
How do I differentiate which pixels are classified by the LDA?

Lester Lim on 28 Jan 2013
Code to run the LDA:
data = importdata('LDA data.mat')
features=data(:,1:end-1); %split data without labels
lable=data(:,end); %get the labels
W=LDA(features,lable); %perform LDA on data
L = [ones(170884,1) features] * W';
P = exp(L) ./ repmat(sum(exp(L),1),[170884 1]);
handles.features = features;
guidata(hObject, handles);
========================================================================
LDA code:
% LDA - MATLAB subroutine to perform linear discriminant analysis
% by Will Dwinnell and Deniz Sevis
%
% Use:
% W = LDA(Input,Target,Priors)
%
% W = discovered linear coefficients (first column is the constants)
% Input = predictor data (variables in columns, observations in rows)
% Target = target variable (class labels)
% Priors = vector of prior probabilities (optional)
%
% Note: discriminant coefficients are stored in W in the order of unique(Target)
%
% Example:
%
% % Generate example data: 2 groups, of 10 and 15, respectively
% X = [randn(10,2); randn(15,2) + 1.5]; Y = [zeros(10,1); ones(15,1)];
%
% % Calculate linear discriminant coefficients
% W = LDA(X,Y);
%
% % Calculate linear scores for training data
% L = [ones(25,1) X] * W';
%
% % Calculate class probabilities
% P = exp(L) ./ repmat(sum(exp(L),2),[1 2]);
%
%
% Last modified: Dec-11-2010
function W = LDA(Input,Target,Priors)

% Determine size of input data
[n m] = size(Input);

% Discover and count unique class labels
ClassLabel = unique(Target);
k = length(ClassLabel);

% Initialize
nGroup    = NaN(k,1);    % Group counts
GroupMean = NaN(k,m);    % Group sample means
PooledCov = zeros(m,m);  % Pooled covariance
W         = NaN(k,m+1);  % Model coefficients

if (nargin >= 3)  PriorProb = Priors;  end

% Loop over classes to perform intermediate calculations
for i = 1:k
    % Establish location and size of each class
    Group = (Target == ClassLabel(i));
    nGroup(i) = sum(double(Group));

    % Calculate group mean vectors
    GroupMean(i,:) = mean(Input(Group,:));

    % Accumulate pooled covariance information
    PooledCov = PooledCov + ((nGroup(i) - 1) / (n - k)) .* cov(Input(Group,:));
end

% Assign prior probabilities
if (nargin >= 3)
    % Use the user-supplied priors
    PriorProb = Priors;
else
    % Use the sample probabilities
    PriorProb = nGroup / n;
end

% Loop over classes to calculate linear discriminant coefficients
for i = 1:k
    % Intermediate calculation for efficiency
    % This replaces: GroupMean(g,:) * inv(PooledCov)
    Temp = GroupMean(i,:) / PooledCov;

    % Constant
    W(i,1) = -0.5 * Temp * GroupMean(i,:)' + log(PriorProb(i));

    % Linear
    W(i,2:end) = Temp;
end

% Housekeeping
clear Temp

end
% EOF
Here's the catch: when I add all the P's together, they don't sum to 1; instead I get values on the order of 1e-100 or smaller. Even ignoring the exponents, the values are still too small. Can anyone point out what's wrong? How do I get labels from this?
2 Comments
Lester Lim on 28 Jan 2013
Apologies for the multiple postings of similar questions; I'm desperate to get the supervised learning done because the deadline is nearing. I seek your understanding...


Accepted Answer

Walter Roberson on 28 Jan 2013
If I trace correctly, L has multiple rows and columns. sum() of an array with multiple rows and columns defaults to summing down each column (dimension 1). I suspect you want to sum across each row instead; that would be sum(L,2)
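As a minimal sketch of that fix, assuming L is the 170884-by-7 score matrix from the code above (the subtraction of the row maximum is an extra guard against overflow in exp(), not part of the original code):

% Normalize each row of L so the class probabilities for every
% observation (pixel) sum to 1: sum along dimension 2, i.e. across classes.
Lshift = L - repmat(max(L,[],2), [1 size(L,2)]);               % guard against overflow in exp()
P = exp(Lshift) ./ repmat(sum(exp(Lshift),2), [1 size(L,2)]);  % row-wise normalization
% Each row of P now sums to 1, so sum(P,2) is a column of ones.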
10 Comments
Walter Roberson on 29 Jan 2013
Is "features" 170884 by 6?
I do not see any reason, from those values, to expect that L should sum to any particular value?
Lester Lim on 29 Jan 2013
Edited: Lester Lim on 29 Jan 2013
It's a 170884-by-21 double. L is a 170884-by-7 double. But the main point is that I can't tell which class label each probability corresponds to...
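One possible way to map the columns of P back to labels, assuming (per the LDA header comment) that the rows of W, and hence the columns of P, follow the order of unique(Target); the names lable and P are taken from the code above, and this sketch has not been run on the actual data:

ClassLabels = unique(lable);     % column i of P corresponds to ClassLabels(i)
[~, idx] = max(P, [], 2);        % index of the most probable class in each row
predicted = ClassLabels(idx);    % 170884-by-1 vector of predicted class labels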


More Answers (0)
