Multi-object detection problem - YOLOv2

Oscar Lema, 25 March 2020
Commented: Oscar Lema, 22 November 2020
Hi,
I would like to use YOLOv2 to detect different classes (20 in total, but I'm starting with 2): airplane and ship. When I train with just one class there is no problem; I can detect all airplanes in the test images. The problem appears when I add a second class (ship): then I can't detect airplanes or ships. Training runs, but nothing is detected. Do you know why?
I'm using 854 images for airplanes and 1701 for ships.
I've followed the official tutorial (https://www.mathworks.com/help/deeplearning/ug/object-detection-using-yolo-v2.html), but it only uses one class, as do all the examples I've found.
Here is my code:
inputSize = [400 400 3];
doTraining = true;
classes = ["airplane","ship"];
pathToImages = 'path';
images = imageDatastore(pathToImages,'IncludeSubfolders',true);
% Build a table: one column of image filenames, plus one column of
% bounding boxes per class. The annotation .txt files mirror the image
% folder layout ("images" -> "annotations", ".jpg" -> ".txt").
annotations = images.Files(1:end);
for i = 1:length(annotations)
    file = strrep(char(annotations(i)),"images","annotations");
    file = strrep(file,"jpg","txt");
    % The class name appears as a folder in the path; find which one.
    % ("parts" instead of "class" to avoid shadowing the built-in class function.)
    parts = split(file,"\");
    position = find(contains(classes,parts)) + 1;
    annotations(i,position) = {load(file)};
end
annotations = cell2table(annotations, ...
    'VariableNames',{'imageFilename','airplane','ship'});
rng(0);
shuffledIndices = randperm(height(annotations));
idx = floor(0.6 * length(shuffledIndices));
trainingIdx = 1:idx;
trainingDataTbl = annotations(shuffledIndices(trainingIdx),:);
% Note: idx + floor(...), not idx + 1 + floor(...), otherwise the
% validation split picks up one extra image.
validationIdx = idx+1 : idx + floor(0.1 * length(shuffledIndices));
valDataTbl = annotations(shuffledIndices(validationIdx),:);
imdsTrain = imageDatastore(trainingDataTbl{:,'imageFilename'});
bldsTrain = boxLabelDatastore(trainingDataTbl(:,2:end));
imdsVal = imageDatastore(valDataTbl{:,'imageFilename'});
bldsVal = boxLabelDatastore(valDataTbl(:,2:end));
trainingData = combine(imdsTrain,bldsTrain);
valData = combine(imdsVal,bldsVal);
% Preview one training sample with its boxes as a sanity check.
data = read(trainingData);
I = data{1};
bbox = data{2};
annotatedImage = insertShape(I,'Rectangle',bbox);
annotatedImage = imresize(annotatedImage,2);
figure
imshow(annotatedImage)
numClasses = length(classes);
trainingDataForEstimation = transform(trainingData,@(data)preprocessData(data,inputSize));
numAnchors = 7;
% Estimate anchors on the *preprocessed* (resized) data so they match the
% 400x400 network input; estimating on the raw trainingData, as in the
% original post, yields anchors at the wrong scale.
[anchorBoxes, meanIoU] = estimateAnchorBoxes(trainingDataForEstimation, numAnchors)
featureExtractionNetwork = resnet50;
featureLayer = 'activation_40_relu';
lgraph = yolov2Layers(inputSize,numClasses,anchorBoxes,featureExtractionNetwork,featureLayer);
% Augment, then resize/rescale the training data to the network input size.
augmentedTrainingData = transform(trainingData,@augmentData);
preprocessedTrainingData = transform(augmentedTrainingData,@(data)preprocessData(data,inputSize));
data = read(preprocessedTrainingData);
options = trainingOptions('sgdm', ...
    'MiniBatchSize',16, ...
    'InitialLearnRate',1e-3, ...
    'MaxEpochs',20, ...
    'CheckpointPath',tempdir, ...
    'Shuffle','never');
if doTraining
    % Train the YOLO v2 detector.
    [detector,info] = trainYOLOv2ObjectDetector(preprocessedTrainingData,lgraph,options);
else
    pretrained = load('yolov2ResNet50VehicleExample_19b.mat');
    detector = pretrained.detector;
end
% Run the trained detector on a test image (this is where nothing is detected).
I = imread('pathToAirPlaneTestImage');   % note: the closing quote was missing in the original post
[bboxes,scores] = detect(detector,I);
if ~isempty(bboxes)
    I = insertObjectAnnotation(I,'rectangle',bboxes,scores);
    figure
    imshow(I)
end
Thank you.
5 comments
Oscar Lema, 22 November 2020
While training or before? I trained with that table (later I created the datastores). Maybe your error is that you have incorrect bounding boxes. For example, with a 416x416 image and a [400 400 50 50] bounding box, xmax and ymax would both be 450, which is impossible inside a 416x416 image.
Are you using the correct bounding-box format, [xmin ymin width height]?
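As a minimal sketch of that check (the image size and boxes below are made-up placeholders, not from the poster's data), you can verify every box stays inside its image before training:

```matlab
% Hypothetical check: do [xmin ymin width height] boxes fit in the image?
imgSize = [416 416];                     % [height width] of the image
boxes = [400 400 50 50;                  % one row per box
          10  10 100 100];
xmax = boxes(:,1) + boxes(:,3);          % right edge of each box
ymax = boxes(:,2) + boxes(:,4);          % bottom edge of each box
valid = boxes(:,1) >= 1 & boxes(:,2) >= 1 & ...
        xmax <= imgSize(2) & ymax <= imgSize(1);
% The first box runs off the image (xmax = ymax = 450 > 416), so
% valid = [0; 1]; any such box should be clipped or removed.
```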
Oscar Lema, 22 November 2020
I attach the test datastore. The training datastore should follow the same format (only the data changes). You can load it in MATLAB and inspect its structure. It works for me.
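For reference, a two-class ground-truth table in that format can be sketched by hand like this (the file paths and box values are placeholders, not the attached data); boxLabelDatastore reads the class names from the box column headings:

```matlab
% Hypothetical two-class ground-truth table in [xmin ymin width height] format.
imageFilename = {'images\airplane\img1.jpg'; 'images\ship\img2.jpg'};
airplane = {[30 30 100 60]; zeros(0,4)};  % zeros(0,4) when the class is absent
ship     = {zeros(0,4); [50 40 120 80]};
gtTbl = table(imageFilename, airplane, ship);
% Image paths go to an imageDatastore; only the box columns go to the
% boxLabelDatastore, whose labels come from the column names.
imds = imageDatastore(gtTbl.imageFilename);
blds = boxLabelDatastore(gtTbl(:,2:end));
```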


Answers (0)
