How to do multiple-output regression

jaehong kim on 12 Feb 2021
Edited: KSSV on 21 Sep 2022
I want to know how to write a custom training loop for regression with multiple outputs. I would like a simple code example of custom multiple-output regression. Every example I find is about CNNs, but I just want a DNN. Thank you for reading my question :)

Answers (1)

Raynier Suresh on 17 Feb 2021
The code below gives an example of how to create and train a custom network with multiple regression outputs.
%% Create the network with multiple outputs
layers = [imageInputLayer([28 28 1],'Normalization','none','Name','in')
    fullyConnectedLayer(1,'Name','fc1')];
lgraph = layerGraph(layers);
lgraph = addLayers(lgraph,fullyConnectedLayer(1,'Name','fc2'));
lgraph = connectLayers(lgraph,'in','fc2');
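% 'fc1' and 'fc2' both branch directly from the input layer 'in',
% giving the network two regression outputs that are trained jointly.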
figure
plot(lgraph)
dlnet = dlnetwork(lgraph);
%% Training Data
XTrain = rand(28,28,1,50);  % Input data (50 images of size 28x28x1)
YTrain1 = randi(10,50,1);   % Regression output data for output 1
YTrain2 = randi(10,50,1);   % Regression output data for output 2
dsXTrain = arrayDatastore(XTrain,'IterationDimension',4);
dsYTrain1 = arrayDatastore(YTrain1);
dsYTrain2 = arrayDatastore(YTrain2);
dsTrain = combine(dsXTrain,dsYTrain1,dsYTrain2);
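% Each read from the combined datastore returns one observation as a
% cell array {X,Y1,Y2}; preprocessData (defined below) assembles these
% into mini-batched arrays.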
%% Train the Network
numEpochs = 3;
miniBatchSize = 128;
plots = "training-progress";
mbq = minibatchqueue(dsTrain, ...
    'MiniBatchSize',miniBatchSize, ...
    'MiniBatchFcn',@preprocessData, ...
    'MiniBatchFormat',{'SSCB','',''});
if plots == "training-progress"
    figure
    lineLossTrain = animatedline('Color',[0.85 0.325 0.098]);
    ylim([0 inf])
    xlabel("Iteration"); ylabel("Loss"); grid on
end
% Initialize the Adam solver state (average gradient and average squared gradient).
trailingAvg = [];
trailingAvgSq = [];
iteration = 0;
start = tic;
% Loop over epochs.
for epoch = 1:numEpochs
    % Shuffle data.
    shuffle(mbq)
    % Loop over mini-batches.
    while hasdata(mbq)
        iteration = iteration + 1;
        [dlX,dlY1,dlY2] = next(mbq);
        % Evaluate the model gradients, state, and loss using dlfeval and
        % the modelGradients function.
        [gradients,state,loss] = dlfeval(@modelGradients,dlnet,dlX,dlY1,dlY2);
        dlnet.State = state;
        % Update the network parameters using the Adam optimizer.
        [dlnet,trailingAvg,trailingAvgSq] = adamupdate(dlnet,gradients, ...
            trailingAvg,trailingAvgSq,iteration);
        % Display the training progress.
        if plots == "training-progress"
            D = duration(0,0,toc(start),'Format','hh:mm:ss');
            addpoints(lineLossTrain,iteration,double(gather(extractdata(loss))))
            title("Epoch: " + epoch + ", Elapsed: " + string(D))
            drawnow
        end
    end
end
%% Functions required to train the network
function [gradients,state,loss] = modelGradients(dlnet,dlX,T1,T2)
    [dlY1,dlY2,state] = forward(dlnet,dlX,'Outputs',["fc1" "fc2"]);
    lossT1 = mse(dlY1,T1);
    lossT2 = mse(dlY2,T2);
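    % Combine the two losses into a single scalar objective; the 0.1
    % weights are this example's choice and can be tuned per output.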
    loss = 0.1*lossT1 + 0.1*lossT2;
    gradients = dlgradient(loss,dlnet.Learnables);
end
function [X,Y1,Y2] = preprocessData(XCell,Y1Cell,Y2Cell)
    % Concatenate the images along the 4th (batch) dimension and the
    % targets along the 2nd dimension.
    X = cat(4,XCell{:});
    Y1 = cat(2,Y1Cell{:});
    Y2 = cat(2,Y2Cell{:});
end
For more information, see the MathWorks documentation on custom training loops.
  2 Comments
Cheng Qiu on 28 Sep 2021
Edited: 28 Sep 2021
I always get an error saying the value of the derivative must be a scalar dlarray with trace. How do I fix it? My code is below.
Cheng Qiu on 28 Sep 2021
Edited: KSSV on 21 Sep 2022
layers = [
    imageInputLayer([32 1],'Name','input','Normalization','none')
    fullyConnectedLayer(Nhide,"Name","FC1")
    reluLayer("Name","Relu1")
    fullyConnectedLayer(Nhide,"Name","FC2")
    dropoutLayer(0.5,"Name","DO")
    fullyConnectedLayer(outputSize,"Name","FC3")
    reluLayer('Name','Relu2')];
lgraph = layerGraph(layers);
dlnet = dlnetwork(lgraph);
% Training options
numEpochs = 1e3;
miniBatchSize = 32;
initialLearnRate = 0.001;
decay = 0.01;
momentum = 0.9;
plots = "training-progress";
executionEnvironment = "auto";
if plots == "training-progress"
    figure
    lineLossTrain = animatedline('Color',[0.85 0.325 0.098]);
    ylim([0 inf])
    xlabel("Iteration")
    ylabel("Loss")
    grid on
end
%% Train the network
numObservations = numel(output);
numIterationsPerEpoch = floor(numObservations./miniBatchSize);
iteration = 0;
start = tic;
% Loop over epochs.
for epoch = 1:numEpochs
    % Shuffle data.
    idx = randperm(numel(OutPower(:,1)));
    input = input(:,:,:,idx);
    output = output(:,idx);
    % Loop over mini-batches.
    for i = 1:numIterationsPerEpoch
        iteration = iteration + 1;
        % Read a mini-batch of data.
        idx = (i-1)*miniBatchSize+1:i*miniBatchSize;
        X = input(:,:,:,idx);
        Y1 = output(1,idx);
        Y2 = output(2,idx);
        Y3 = output(3,idx);
        Y4 = output(4,idx);
        % Convert the mini-batch of data to dlarray.
        dlX = dlarray(X,'SSCB');
        dlY1 = dlarray(Y1,'SB');
        dlY2 = dlarray(Y2,'SB');
        dlY3 = dlarray(Y3,'SB');
        dlY4 = dlarray(Y4,'SB');
        % dlY = dlarray(Y,'SSCB');
        % If training on a GPU, then convert data to gpuArray.
        if (executionEnvironment == "auto" && canUseGPU) || executionEnvironment == "gpu"
            dlX = gpuArray(dlX);
        end
        % Evaluate the model gradients, state, and loss using dlfeval and
        % the modelGradients function, and update the network state.
        [gradients,state,loss] = dlfeval(@modelGradients,dlnet,dlX,dlY1,dlY2,dlY3,dlY4);
        dlnet.State = state;
    end
end

function [gradients,state,loss] = modelGradients(dlnet,dlX,Y1,Y2,Y3,Y4)
    [dlYPred,state] = forward(dlnet,dlX);
    loss = sqrt((dlYPred(1)-Y1).^2+(dlYPred(2)-Y2).^2+(dlYPred(3)-Y3).^2+(dlYPred(4)-Y4).^2)/2;
    gradients = dlgradient(loss,dlnet.Learnables);
end
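The error appears because dlgradient requires the loss to be a scalar dlarray, but dlYPred(1)-Y1 above is 1-by-miniBatchSize, so loss comes out as a vector (note also that dlYPred(1) selects only a single element, not the whole first response row dlYPred(1,:)). A minimal sketch of one possible fix, assuming outputSize is 4 so the network predicts all four targets: stack the targets into one 'CB'-formatted dlarray and use mse, which reduces to a scalar.
% Inside the mini-batch loop, replace the four 'SB' targets with one
% 'CB' array (a sketch, not the original poster's code):
dlY = dlarray(output(:,idx),'CB');   % 4-by-miniBatchSize stacked targets
[gradients,state,loss] = dlfeval(@modelGradients,dlnet,dlX,dlY);
dlnet.State = state;

function [gradients,state,loss] = modelGradients(dlnet,dlX,dlY)
    [dlYPred,state] = forward(dlnet,dlX);
    % mse reduces over all elements and returns a scalar dlarray,
    % which is what dlgradient requires.
    loss = mse(dlYPred,dlY);
    gradients = dlgradient(loss,dlnet.Learnables);
end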
