Problem with resnet18 on multi-spectral image segmentation

8 views (last 30 days)
文文 on 13 Dec 2023
Commented: 文文 on 15 Dec 2023
Hi, I was trying to set up multi-spectral image segmentation with resnet18, referring to this answer: resnet50-on-multi-spectral-image-segmentation. When it trains on RGB images, the performance is not bad. Then I switched to multi-spectral images (array size 1024x1024x151), but I got the following error: "Invalid training data. The output size (3) of the last layer does not match the number of classes of the responses (1)."
Can anyone help me with that?
imageDir = 'C:\dataset\train_images';
imds = MySequenceDatastore(imageDir);   % custom datastore, defined below
classNames = ["norm","low","high"];
labelDir = 'C:\dataset\PixelLabelData';
labelIDs = [0,1,2];
pxds = pixelLabelDatastore(labelDir,classNames,labelIDs);
imageSize = [1024 1024 3];
N = 151;                                % number of spectral bands
numClasses = numel(classNames);
lgraph = deeplabv3plusLayers(imageSize, numClasses, "resnet18");
analyzeNetwork(lgraph)
layers = lgraph.Layers;
% Replace the input layer and the first convolution so the network accepts N channels
newlgraph = replaceLayer(lgraph,'data',imageInputLayer([1024 1024 N],'Name','input'));
newlgraph = replaceLayer(newlgraph,'conv1',convolution2dLayer(7,64,'Stride',[2 2],'Padding',[3 3 3 3],'Name','conv1'));
analyzeNetwork(newlgraph)
trainData = combine(imds,pxds);
opts = trainingOptions('sgdm', ...
    'ExecutionEnvironment','gpu', ...
    'InitialLearnRate',0.001, ...
    'MiniBatchSize',16, ...
    'Plots','training-progress', ...
    'MaxEpochs',30);
[net,info] = trainNetwork(trainData,newlgraph,opts);
%% MySequenceDatastore class definition
classdef MySequenceDatastore < matlab.io.Datastore & ...
        matlab.io.datastore.MiniBatchable

    properties
        Datastore
        Labels
        NumClasses
        SequenceDimension
        MiniBatchSize
    end

    properties(SetAccess = protected)
        NumObservations
    end

    properties(Access = private)
        % This property is inherited from Datastore
        CurrentFileIndex
    end

    methods
        function ds = MySequenceDatastore(folder)
            % Construct a MySequenceDatastore object

            % Create a file datastore. The readSequence function is
            % defined following the class definition.
            fds = fileDatastore(folder, ...
                'ReadFcn',@readSequence, ...
                'IncludeSubfolders',true);
            ds.Datastore = fds;

            % Read labels from folder names
            numObservations = numel(fds.Files);
            for i = 1:numObservations
                file = fds.Files{i};
                filepath = fileparts(file);
                [~,label] = fileparts(filepath);
                labels{i,1} = label;
            end
            ds.Labels = categorical(labels);
            ds.NumClasses = numel(unique(labels));

            % Determine sequence dimension. When you define the LSTM
            % network architecture, you can use this property to
            % specify the input size of the sequenceInputLayer.
            X = preview(fds);
            ds.SequenceDimension = size(X,1);

            % Initialize datastore properties.
            ds.MiniBatchSize = 1; %128
            ds.NumObservations = numObservations;
            ds.CurrentFileIndex = 1;
        end

        function dsNew = shuffle(ds)
            % dsNew = shuffle(ds) shuffles the files and the
            % corresponding labels in the datastore.

            % Create a copy of datastore
            dsNew = copy(ds);
            dsNew.Datastore = copy(ds.Datastore);
            fds = dsNew.Datastore;

            % Shuffle files and corresponding labels
            numObservations = dsNew.NumObservations;
            idx = randperm(numObservations);
            fds.Files = fds.Files(idx);
            dsNew.Labels = dsNew.Labels(idx);
        end

        function tf = hasdata(ds)
            % Return true if more data is available
            tf = ds.CurrentFileIndex + ds.MiniBatchSize - 1 ...
                <= ds.NumObservations;
        end

        function [data,info] = read(ds)
            % Read one mini-batch of data
            miniBatchSize = ds.MiniBatchSize;
            info = struct;
            for i = 1:miniBatchSize
                predictors{i,1} = read(ds.Datastore);
                responses(i,1) = ds.Labels(ds.CurrentFileIndex);
                ds.CurrentFileIndex = ds.CurrentFileIndex + 1;
            end
            data = preprocessData(ds,predictors,responses);
        end

        function data = preprocessData(ds,predictors,responses)
            % data = preprocessData(ds,predictors,responses) preprocesses
            % the data in predictors and responses and returns the table
            % data
            miniBatchSize = ds.MiniBatchSize;

            % Pad data to length of longest sequence.
            sequenceLengths = cellfun(@(X) size(X,2),predictors);
            maxSequenceLength = max(sequenceLengths);
            for i = 1:miniBatchSize
                X = predictors{i};
                % Pad sequence with zeros.
                if size(X,2) < maxSequenceLength
                    X(:,maxSequenceLength) = 0;
                end
                predictors{i} = X;
            end

            % Return data as a table.
            data = table(predictors,responses);
        end

        function reset(ds)
            % Reset to the start of the data
            reset(ds.Datastore);
            ds.CurrentFileIndex = 1;
        end
    end

    methods (Hidden = true)
        function frac = progress(ds)
            % Determine percentage of data read from datastore
            frac = (ds.CurrentFileIndex - 1) / ds.NumObservations;
        end
    end
end
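The readSequence function used as the ReadFcn above is not included in the post. A minimal sketch of what it might look like, assuming each file is a .mat file whose only variable holds the 1024x1024x151 cube (the actual file format and variable name are not shown in the question):
function data = readSequence(filename)
    % Load the hyperspectral cube stored in the file (assumed .mat format)
    s = load(filename);
    vars = fieldnames(s);
    data = s.(vars{1});   % return the first (assumed only) variable in the file
end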
2 Comments
Matt J on 13 Dec 2023
It would be easiest if you just attached newlgraph and trainData in a .mat file.
Matt J on 13 Dec 2023
Also, attach an instance of training and response data by using trainData.read().
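For example, a quick way to grab one mini-batch from the combined datastore and save it for attachment (a sketch using the variable names from the question; the .mat file name is arbitrary):
reset(trainData);                  % start from the first observation
sample = read(trainData);          % one read from the combined datastore
whos sample                        % inspect what combine() actually returns
save('trainData_sample.mat','sample','newlgraph');   % hypothetical file name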


Answers (1)

Vinayak Choyyan on 14 Dec 2023
Hi,
From a quick look at the code, your custom datastore is returning a table of predictors and responses. You then combine imds, the custom datastore, with pxds, a pixelLabelDatastore, which also provides responses. This might be unintended and could be the reason for the error.
The error you are seeing occurs when the network being trained expects a certain number of classes (3 in your case) but the responses passed to it contain only one class. I suggest calling read(trainData) first to check whether your training datastore is actually returning what you expect.
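For reference, a common pattern for multispectral semantic segmentation is to let a plain imageDatastore with a custom ReadFcn supply only the predictors and let the pixelLabelDatastore supply the responses, then combine the two. A minimal sketch, assuming the cubes are .mat files readable by a function such as readSequence (both assumptions; neither is shown in the question):
imdsCube = imageDatastore(imageDir, ...
    'FileExtensions','.mat', ...        % assumed file format
    'ReadFcn',@readSequence);           % should return the 1024x1024x151 cube
pxds = pixelLabelDatastore(labelDir,classNames,labelIDs);
trainData = combine(imdsCube,pxds);     % each read: {cube, categorical label image}
sample = read(trainData);               % sanity check before calling trainNetwork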
9 Comments
Vinayak Choyyan on 15 Dec 2023
This seems like a data issue: there is a mismatch in array size. When trainNetwork reads through all of the data, some of the hyperspectral cubes come back with different sizes. You will have to check this yourself, as I do not have the data.
文文 on 15 Dec 2023
I totally understand what you mean. It would be even better if you could suggest some ways to check the data.
Thank you for your help over the past few days.
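One way to run that check (a minimal sketch, assuming every cube can be loaded with the same readSequence function the datastore uses) is to loop over the files and flag anything that is not 1024x1024x151:
files = imds.Datastore.Files;           % files backing the custom datastore
for k = 1:numel(files)
    cube = readSequence(files{k});
    if ~isequal(size(cube),[1024 1024 151])
        fprintf('%s has size %s\n', files{k}, mat2str(size(cube)));
    end
end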
