Out of memory while training
% Network architecture
layers = [ ...
    imageInputLayer([768 1024 3])
    convolution2dLayer(5,20)
    reluLayer
    maxPooling2dLayer(2,'Stride',2)
    fullyConnectedLayer(2)
    softmaxLayer
    classificationLayer];

% Training options
options = trainingOptions('sgdm', ...
    'MaxEpochs',20, ...
    'InitialLearnRate',1e-4, ...
    'Verbose',false, ...
    'Plots','training-progress');

% Image datastore; labels are taken from the subfolder names
imds = imageDatastore('datasets/train', ...
    'IncludeSubfolders',true, ...
    'LabelSource','foldernames');

% Resize every image on read to match the input layer
inputSize = [768 1024];
imds.ReadFcn = @(loc)imresize(imread(loc),inputSize);

% Split into training and test sets, then train and save the network
numTrainingFiles = 51;
[imdsTrain,imdsTest] = splitEachLabel(imds,numTrainingFiles,'randomize');
net = trainNetwork(imdsTrain,layers,options);
save('net.mat','net');
The code above trains a model on a set of images. I pass the folder location "train", which contains two subfolders holding the training images. The problem is that training fails with an out-of-memory error even though there are only about 100 images. When I train on a single subfolder only, it works flawlessly. The error may be caused by the following lines:
inputSize = [768 1024];
imds.ReadFcn = @(loc)imresize(imread(loc),inputSize);
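As a quick sanity check (a sketch, assuming the imds variable defined above), reading one image through the datastore shows what the custom ReadFcn actually returns per file:

% Sketch: verify the ReadFcn output against the imageInputLayer size [768 1024 3]
reset(imds);
img = read(imds);   % applies the custom ReadFcn, including the imresize call
disp(size(img));    % expected: 768 x 1024 x 3 for an RGB image
whos img            % class and memory footprint of a single resized image
reset(imds);        % rewind so training still sees every file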
Answers (1)
Gaurav Garg
20 Nov 2019
Hi,
Consider using 'MiniBatchSize' and 'ExecutionEnvironment' as arguments in trainingOptions.
With 'MiniBatchSize' you can specify the mini-batch size used for training; try a smaller value such as 10 in your case, since smaller batches keep fewer images in memory at once.
With 'ExecutionEnvironment' you can set the training environment to 'cpu', 'gpu', or 'auto' (which trains on a GPU if one is available).
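As a rough sketch (reusing the options from the question; the values for the two added arguments are only suggestions to adapt):

% Sketch: add 'MiniBatchSize' and 'ExecutionEnvironment' to the existing options
options = trainingOptions('sgdm', ...
    'MaxEpochs',20, ...
    'InitialLearnRate',1e-4, ...
    'MiniBatchSize',10, ...              % smaller than the default of 128, so less memory per iteration
    'ExecutionEnvironment','auto', ...   % 'cpu', 'gpu', or 'auto'
    'Verbose',false, ...
    'Plots','training-progress');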