How to use softmax and a loss function (negative log probability) for classification

5 views (last 30 days)
Kong on 2 Apr 2020
Answered: Shishir Singhal on 7 Apr 2020
Hello.
I want to classify videos.
After computing the Euclidean distance, I want to use softmax and a loss function (negative log probability) for classification.
Could I get some ideas on how to write the code?
clear all
close all
data = csvread('outfile.csv');   % each row: feature vector, last column = class label
values = data(:,1:end-1);        % feature vectors
labels = data(:,end);            % class labels (assumed to be 0..4)
avg = splitapply(@(x) mean(x,1), values, labels+1);   % per-class mean feature vector, one row per class
mean_class1 = avg(1,:);
mean_class2 = avg(2,:);
mean_class3 = avg(3,:);
mean_class4 = avg(4,:);
mean_class5 = avg(5,:);
% example query feature vectors (here simply the first five rows of the data)
bend_query = values(1,:);
run_query = values(2,:);
walk_query = values(3,:);
skip_query = values(4,:);
wave_query = values(5,:);
% calculate the Euclidean distance between each query and the mean of its own class
euclidean_bend = pdist2(mean_class1, bend_query, 'euclidean');
euclidean_run = pdist2(mean_class2, run_query, 'euclidean');
euclidean_walk = pdist2(mean_class3, walk_query, 'euclidean');
euclidean_skip = pdist2(mean_class4, skip_query, 'euclidean');
euclidean_wave = pdist2(mean_class5, wave_query, 'euclidean');
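Something like the following is what I have in mind (I am not sure whether it is the right approach), where one query is compared against all five class means, the negative distances are turned into softmax probabilities, and the loss is the negative log probability of the true class:
% my attempt (not sure if this is correct)
class_means = [mean_class1; mean_class2; mean_class3; mean_class4; mean_class5];
query = bend_query;                           % example: classify the "bend" query
d = pdist2(class_means, query, 'euclidean');  % 5x1 vector, one distance per class
scores = -d;                                  % smaller distance -> larger score
scores = scores - max(scores);                % shift for numerical stability
probs = exp(scores) / sum(exp(scores));       % softmax probabilities
true_class = 1;                               % assuming "bend" is class 1
loss = -log(probs(true_class))                % negative log probability loss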

Accepted Answer

Shishir Singhal on 7 Apr 2020
For classification, softmax creates a probability score for each category.
Since your predictions and targets follow different probability distributions, you can use cross-entropy loss for that; it is a kind of negative log probability function.
Refer to this documentation for the implementation: https://www.mathworks.com/help/deeplearning/ref/dlarray.crossentropy.html
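A rough sketch of how those functions could be combined in your case (this assumes the Deep Learning Toolbox is available, that a smaller distance means a more likely class, and the numbers are only placeholders):
% example scores: negative Euclidean distances from one query to each of the 5 class means
scores = -[2.1; 0.5; 3.0; 1.7; 4.2];
dlScores = dlarray(scores, 'CB');       % formatted dlarray (channel x batch)
dlProbs = softmax(dlScores);            % probability score for each category
target = zeros(5,1); target(2) = 1;     % one-hot target, e.g. the true class is class 2
loss = crossentropy(dlProbs, target);   % cross-entropy = negative log probability of the true class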

More Answers (0)
