
Output Function to Save Net on Every Validation

Grant Anderson on 6 May 2020
Commented: Ameer Hamza on 12 May 2020
I'm curious whether it's possible to define an output function that captures the current state of the network while it trains, putting the current net into a structure in the same way that [net,tr] = trainNetwork() returns them when training finishes, but doing so during training.
I can't use checkpoints because I am using an ADAM solver for my network. What I'd like to end up with is something like:
1: Net, TR
2: Net, TR
3: Net, TR
4: Net, TR
etc.
1 Comment
Ameer Hamza on 6 May 2020
It seems that the outputFcn cannot save the network itself after each iteration. Is saving just the state of network training enough?


Answers (1)

Ameer Hamza on 6 May 2020
Edited: Ameer Hamza on 6 May 2020
If you just want to save the training states, then try the following code, adapted from this documentation example: https://www.mathworks.com/help/releases/R2020a/deeplearning/ref/trainingoptions.html#bvniuj4
% Load the digit images and hold out 1000 of them for validation
[XTrain,YTrain] = digitTrain4DArrayData;
idx = randperm(size(XTrain,4),1000);
XValidation = XTrain(:,:,:,idx);
XTrain(:,:,:,idx) = [];
YValidation = YTrain(idx);
YTrain(idx) = [];

% A small convolutional network for the 28x28 grayscale digit images
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,8,'Padding','same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2,'Stride',2)
    convolution2dLayer(3,16,'Padding','same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2,'Stride',2)
    convolution2dLayer(3,32,'Padding','same')
    batchNormalizationLayer
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

options = trainingOptions('sgdm', ...
    'MaxEpochs',8, ...
    'ValidationData',{XValidation,YValidation}, ...
    'ValidationFrequency',30, ...
    'Verbose',false, ...
    'Plots','training-progress', ...
    'OutputFcn',@outFcn);

% Global struct array that collects the training state at every iteration
global training_state
training_state = [];

net = trainNetwork(XTrain,YTrain,layers,options);

% Output function: append this iteration's info struct to the global log
function stop = outFcn(info)
    global training_state
    training_state = [training_state info];
    stop = false;   % returning true would stop training early
end
The use of the global variable can be avoided if you define your own handle class and pass one of its methods as the output function. However, if you are fine with using a global, it shouldn't be an issue.
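Here is a minimal sketch of that approach. The class name TrainingLogger and its log method are my own illustration, not from any toolbox; save the class in its own file named TrainingLogger.m:

classdef TrainingLogger < handle
    % Hypothetical handle class that accumulates the info struct
    % passed to the training output function at every iteration.
    properties
        states = [];   % struct array of recorded training states
    end
    methods
        function stop = log(obj, info)
            % Handle semantics: the same object is updated on every call
            obj.states = [obj.states info];
            stop = false;   % never request early stopping
        end
    end
end

Then bind a method of a logger object instead of using a global:

logger = TrainingLogger;
options = trainingOptions('sgdm', ...
    'MaxEpochs',8, ...
    'OutputFcn',@(info) logger.log(info));
net = trainNetwork(XTrain,YTrain,layers,options);
% logger.states now holds one entry per training iteration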
4 Comments
Grant Anderson on 11 May 2020
Edited: Grant Anderson on 11 May 2020
This is the function into which I feed parameters to train a neural network. It lets me resize the fully connected layers and the number of neurons.
function [net,tr] = betNet(X,y,X_test,y_test,X_cv,y_cv,maxE,NHL,fcls)
%% ===== Setting up DNN =====
% Fully connected layers: NHL hidden layers of width fcls, followed by a
% two-class output layer
fcl1 = fullyConnectedLayer(fcls,'BiasInitializer','narrow-normal');
fcl2 = fullyConnectedLayer(2,'BiasInitializer','ones');
ip = sequenceInputLayer(size(X,1),'Normalization','zerocenter');
sml = softmaxLayer('Name','sml');

options = trainingOptions('adam', ...
    'MaxEpochs',maxE, ...
    'ExecutionEnvironment','gpu', ...
    'Shuffle','every-epoch', ...
    'MiniBatchSize',64, ...
    'ValidationFrequency',50, ...
    'ValidationData',{X_cv,y_cv}, ...
    'OutputFcn',@outFcn);

layers = [ip repmat(fcl1,1,NHL) fcl2 sml classificationLayer];

% Global struct array that records the training state at every iteration
global training_state
training_state = [];

%% ===== Training NN =====
[net,tr] = trainNetwork(X,y,layers,options);

    % Nested output function: append this iteration's info struct
    function stop = outFcn(info)
        global training_state
        training_state = [training_state info];
        stop = false;
    end
end
Ameer Hamza on 12 May 2020
If you want to check the value of training_state in the base workspace after your function has run, you should also execute the following line in the Command Window before calling the function.
global training_state
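For example, assuming the arrays X, y, X_test, y_test, X_cv, and y_cv already exist in the workspace (the numeric arguments 20, 3, and 128 below are placeholder values for illustration):

global training_state   % share the global with the base workspace
[net,tr] = betNet(X,y,X_test,y_test,X_cv,y_cv,20,3,128);
plot([training_state.TrainingLoss])   % inspect the loss recorded at each iteration

TrainingLoss is one of the fields of the info struct that trainNetwork passes to the output function, so the recorded states can be plotted or examined after training completes.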
