Matlab function trainNetwork doesn't generate training progress plot

10 views (last 30 days)
Heather Riley on 17 Jan 2020
Answered: Hiro Yoshino on 17 Jan 2020
I'm trying to see the training progress plot of my CNN, but when I run my script it just ends without producing any output (no figure, no output text). What am I doing wrong?
My code is below; the section labeled %% TRAIN THE NEURAL NETWORK is where I think the issue is.
function segmentation_neural_network()
%% FETCH AND PARSE DATASET
% Set datapath
datapath = 'D:\20190618-f1\images_extracted_from_zebrafish_movies\20190618-f1_10-27-56\cropped_and_rotated';
training_datapath = strcat(datapath,'\training_dataset');
testing_datapath = strcat(datapath,'\testing_dataset');
% Get training and testing datasets
training_dataset = imageDatastore(strcat(training_datapath,'\images_rgb'));
testing_dataset = imageDatastore(strcat(testing_datapath,'\images_rgb'));
% Get pixel map labels
load(strcat(training_datapath,'\pixel_maps\eye_segmentation_labels.mat'));
labels = pixelLabelDatastore(gTruth);
% Weight segmentation class importance by the number of pixels in each class
pixel_count = countEachLabel(labels); % count number of each type of pixel
frequency = pixel_count.PixelCount ./ pixel_count.ImagePixelCount; % calculate pixel type frequencies
class_weights = mean(frequency) ./ frequency; % create class weights that balance the loss function so that more common pixel types won't be preferred
%% CREATE THE NEURAL NETWORK
% Specify the input image size.
imageSize = [64 64 3];
% Specify the number of classes.
numClasses = 2; % eye, not eye
% Create DeepLab v3+.
lgraph = helperDeeplabv3PlusResnet18(imageSize, numClasses);
% Replace the network's classification layer with a pixel classification
% layer that uses class weights to balance the loss function
pxLayer = pixelClassificationLayer('Name','labels','Classes',pixel_count.Name,'ClassWeights',class_weights);
lgraph = replaceLayer(lgraph,"classification",pxLayer);
%% TRAIN THE NEURAL NETWORK
% Training hyper-parameters: edit these settings to fine-tune the network
options = trainingOptions('sgdm', ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',10, ...
    'LearnRateDropFactor',0.3, ...
    'Momentum',0.9, ...
    'InitialLearnRate',1e-3, ...
    'L2Regularization',0.005, ...
    'MaxEpochs',30, ...
    'MiniBatchSize',1, ...
    'Shuffle','every-epoch', ...
    'CheckpointPath','D:\20190618-f1\nn_checkpoints', ...
    'Verbose',true, ...
    'VerboseFrequency',2, ...
    'Plots','training-progress');
% Set up data augmentation to enhance training dataset
augmenter = imageDataAugmenter('RandXReflection',true, 'RandXTranslation',[-10 10],'RandYTranslation',[-10 10]);
% Combine augmented data with training data
augmented_training_dataset = pixelLabelImageDatastore(training_dataset(1:50), labels(1:50), 'DataAugmentation',augmenter);
% Train the network
[eye_segmentation_nn, info] = trainNetwork(augmented_training_dataset,lgraph,options);
end
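
One thing that may be worth checking before the training section is ever reached: training_dataset(1:50) and labels(1:50) index the datastore objects themselves with parentheses rather than selecting the first 50 files, which normally throws an index error and stops the function before trainNetwork is called. A minimal sketch of a call that avoids the indexing, assuming the intent is simply to combine the image and label datastores with augmentation (variable names are the ones from the question; selecting only 50 images would instead mean building the datastores from just those 50 file paths):

% Pass the full datastores; a datastore object cannot be indexed with (1:50)
augmented_training_dataset = pixelLabelImageDatastore(training_dataset, labels, ...
    'DataAugmentation', augmenter);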

Answers (1)

Hiro Yoshino on 17 Jan 2020
How about running this code outside of the function and seeing what happens?
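
A minimal way to try that, as a sketch: paste the body of the function into a script or run the lines at the command line, so the variables stay in the base workspace, and confirm that the Training Progress window opens when trainNetwork runs. Assuming the datastores and lgraph have already been created as in the question:

% Quick check that the training-progress plot opens at all:
% one short epoch with 'Plots','training-progress' should pop up the window
options = trainingOptions('sgdm', ...
    'MaxEpochs',1, ...
    'MiniBatchSize',1, ...
    'Verbose',true, ...
    'Plots','training-progress');
[net, info] = trainNetwork(augmented_training_dataset, lgraph, options);

If neither the plot nor any Verbose text appears even then, the trainNetwork call is probably erroring out or never being reached, so it is worth stepping through with the debugger (for example, dbstop if error) to see where execution actually stops.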

Release

R2019a
