MATLAB Answers

Creating RCNN Detection Using Transfer Learning

36 views (last 30 days)
Matpar
Matpar on 17 Jan 2020
Commented: Matpar on 26 Jan 2020 at 15:21
Hi Professionals,
Can a professional guide me in obtaining the layers? I keep getting this error no matter what I change!
I am lacking some knowledge required for moving forward, so please assist!
Attached is the label file. I tried uploading the image folder too, but it is too large, sorry!
This is my code. Please point me in the right direction; I would really like to solve this! It's mentally challenging when the errors are seemingly invisible!
%% Train R-CNN Stop Sign Detector
% Load training data and network layers.
load('gunx.mat', 'guntr', 'layers')
%% Add the image directory to the MATLAB path.
imDir = fullfile(matlabroot, 'toolbox', 'vision', 'visiondata','gunsGT');
addpath(imDir);
%% Set network training options to use mini-batch size of 32 to reduce
% GPU/CPU memory usage. Lower the InitialLearnRate to reduce the rate at which
% network parameters are changed. This is beneficial when fine-tuning a
% pre-trained network and prevents the network from changing too rapidly.
options = trainingOptions('sgdm','MiniBatchSize', 32,'InitialLearnRate', 1e-6,'MaxEpochs', 10);
%% Train the R-CNN detector. Training can take a few minutes to complete.
rcnn = trainRCNNObjectDetector(gunsGT, layers, options, 'NegativeOverlapRange', [0 0.3]);
%% Test the R-CNN detector on a test image.
img = imread('Gun00012.jpg');
[bbox, score, label] = detect(rcnn, img, 'MiniBatchSize', 32);
%% Display strongest detection result.
[score, idx] = max(score);
bbox = bbox(idx, :);
annotation = sprintf('%s: (Confidence = %f)', label(idx), score);
detectedImg = insertObjectAnnotation(img, 'rectangle', bbox, annotation);
figure
imshow(detectedImg)
%% Remove the image directory from the path.
rmpath(imDir);
These are my errors:
>> guntest2
Warning: Variable 'layers' not found.
> In guntest2 (line 3)
Undefined function or variable 'gunsGT'.
Error in guntest2 (line 16)
rcnn = trainRCNNObjectDetector(gunsGT, layers, options, 'NegativeOverlapRange', [0 0.3]);

  0 comments

Sign in to comment.

Answer (1)

Shashank Gupta
Shashank Gupta on 20 Jan 2020
Hi Matpar,
I am not sure what "gunx.mat" contains, but from the error message one can see that the variable "layers" was not found. Can you check the MAT-file once more? Does it contain the required "layers" variable?
The second error message is about "gunsGT", which I do not have access to, but again it looks like gunsGT is not a valid argument to pass to trainRCNNObjectDetector; this function expects a datastore or a ground truth table (image file names plus bounding-box labels) as its first argument.
Also, if you want me to investigate further, attach all the required MAT-files and functions that are used.
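As a quick sanity check (a sketch only; the file and label names here are assumptions taken from your post, not confirmed), you can inspect the MAT-file before loading it and compare its contents against the table layout trainRCNNObjectDetector expects:

```matlab
% List the variables actually stored in the MAT-file before loading it
whos('-file','gunx.mat')

% trainRCNNObjectDetector expects a table: the first column holds image
% file names, and each remaining column holds M-by-4 [x y width height]
% bounding boxes for one object class. A minimal two-image sketch:
gunsGT = table({'gun1.jpg'; 'gun2.jpg'}, ...
               {[100 80 50 30]; [60 40 45 25]}, ...
               'VariableNames', {'imageFilename','gun'});
```

If whos shows that 'layers' or the ground truth table is missing, that alone explains both error messages.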
I hope this helps.

  3 comments

Matpar
Matpar on 20 Jan 2020
Hi @Shashank Gupta, I found that error, and thanks for responding! I appreciate it loads!
The data was not in the table, actually. The issue I am having now is that the bounding boxes are not around the region of interest!
OK, I will attach all the files now for processing.
Can you assist me with this error please?
I have the code right, I think, to the best of my knowledge, but for some reason I am not seeing the error that is causing the bounding box to NOT appear!
Please see my code!
clear
clc
% deepNetworkDesigner
gunfolder = '/Users/mmgp/Desktop/gunsGT';
save('gunlables.mat','gunfolder');
%% Counting Images In Specified Folder
imgFiles = dir(fullfile(gunfolder,'*.jpg'));
total_images = numel(imgFiles);
%% Accessing Content of Folder TrainingSet Using Datastore
imds = imageDatastore(gunfolder,'IncludeSubFolders',true,'LabelSource','Foldernames');
%% Setting Output Function(images my have size variation resizing for consistency with pretrain net)
imds.ReadFcn=@(loc)imresize(imread(loc),[227,227]);
%% Counting Images In Each Category "If not equal this will create issues"
tbl=countEachLabel(imds);
%% Making Category The Same Number Of Images
minSetCount=min(tbl{:,2});
%% Splitting Inputs Into Training and Testing Sets
[imdsTrain,imdsValidation] = splitEachLabel(imds,0.7,'randomized');
size(imdsTrain);
%% Loading Pretrained Network
net = alexnet; %Trained on 1million+ images/classify images into 1000 object categories
% analyzeNetwork(net) % Display Alexnet architecture & network layer details
%% Read InputSize Of 1st Layer/ AlexNet image requirement is 227 width x 227 height x 3 colour channels
inputSize = net.Layers(1).InputSize;%Displays the input size of Alexnet
%% Counting Total Number Of Images Including Subfolders **IF AVAILABLE**
imgTotal = length(imds.Files);
%% Displaying Multiple Randomized Images Within The Dataset
% a = 4;
% b = 4;
% n = randperm(imgTotal, a*b);
%
% figure(),
% Idx = 1;
% for j=1:a
% for k=1:b
% img=readimage(imds,n(Idx));
% subplot(a,b,Idx)
% imshow(img);
% Idx=Idx+1;
% end
% end
%% Replace Final Layer/Last 3 Configure For 1000 classes
% Finetuning these 3 layers for new classification
% Extracting all Layers except the last 3
layersTransfer = net.Layers(1:end-3);
%% List the image categories/Clases:
numClasses = numel(categories(imdsTrain.Labels));
layers = [
layersTransfer
fullyConnectedLayer(numClasses,'WeightLearnRateFactor',25,'BiasLearnRateFactor',25);
softmaxLayer
classificationLayer];
%% Training The Network
% Resizing images in datastore to meet Alexnet's size requirements
% Utilising Augmented Data Store for automatic resizing of training images
%% Augmented Data Store Prevents Over Fitting By Randomly Flipping Along The Vertical Axis
% Stopping the network from memorizing exact details of the training data
% Also Randomly Translates them up to 30 pixels horizontally & Vertically
pixelRange = [-30 30];
imageAugmenter = imageDataAugmenter( ...
'RandXReflection',true, ...
'RandXTranslation',pixelRange, ...
'RandYTranslation',pixelRange);
augimdsTrain = augmentedImageDatastore(inputSize(1:2),imdsTrain, ...
'DataAugmentation',imageAugmenter);
%% Utilising Data Augmentation For Resizing Validation Data
% Implemented without the overfit-prevention (augmentation) procedures;
% leaving the validation images unmodified keeps the validation predictions precise
%% Resizing Images, Assists With Preventing Overfitting
augmentedTrainingSet = augmentedImageDatastore(inputSize ,imdsTrain,'ColorPreprocessing', 'gray2rgb');
augimdsValidation = augmentedImageDatastore(inputSize,imdsValidation,'ColorPreprocessing', 'gray2rgb');
%% Specifying Training Options
% Keep features from earlier layers of pretrained networked for transfer learning
% Specify epoch training cycle, the mini-batch size and validation data
% Validate the network for each iteration during training
% (SGDM) groups the full dataset into disjoint mini-batches. This reaches convergence
% faster, as it updates the network's weight values more frequently and increases the
% computational speed
%% Implementing For ***VISUAL*** Graphical Representations
% options = trainingOptions('sgdm', ...
% 'MiniBatchSize',32, ...
% 'MaxEpochs',20, ...
% 'InitialLearnRate',0.004, ...
% 'Shuffle','every-epoch', ...
% 'ValidationData',augimdsValidation, ...
% 'ValidationFrequency',3, ...
% 'Verbose',true, ...
% 'Plots','training-progress');
%% Implementing **WITH** The RCNN Object Detector
opts = trainingOptions('sgdm',...
'Momentum',0.9,...
'InitialLearnRate', 1e-4,...
'LearnRateSchedule', 'piecewise', ...
'LearnRateDropFactor', 0.1, ...
'Shuffle','every-epoch', ...
'LearnRateDropPeriod', 8, ...
'L2Regularization', 1e-4, ...
'MaxEpochs', 100,...
'MiniBatchSize',128,...
'Verbose', true);
% Derive the input layer size from a sample training image
img = readimage(imdsTrain,1);
[height,width,numChannels] = size(img);
imageSize = [height width numChannels];
inputLayer = imageInputLayer(imageSize);
%% Training network Consisting Of Transferred & New Layers.
netTransfer = trainNetwork(augmentedTrainingSet,layers,opts);
%% Classifying Validation Images Utilising Fine-tuned Network
[YPred,scores] = classify(netTransfer,augimdsValidation);
%% Displaying 4 Validation Image Samples With Predicted Labels
% idx = randperm(numel(imdsValidation.Files),4);
% figure
% for i = 1:4
% subplot(2,2,i)
% I = readimage(imdsValidation,idx(i));
% imshow(I)
% label = YPred(idx(i));
% title(string(label));
% end
%% Calculating Validation Data Classification Accuracy (Accuracy Labels Predicted Accurately By Network)
YValidation = imdsValidation.Labels;
accuracy = mean(YPred == YValidation);
%% Training the R-CNN detector. Training can take a few minutes to complete.
% Loading .MAT file, the ground truths and the Network layers
load('gTruth.mat')
% Positive and Negative Overlap Range Controls Which Image Patch is Used
rcnn = trainRCNNObjectDetector(gTruth, netTransfer, opts, 'NegativeOverlapRange', [0 0.3]);
%% Testing the R-CNN detector on a test image.
testimg = imread('59.jpg');
[bboxes,score,label] = detect(rcnn,testimg,'MiniBatchSize',128)
%% Display strongest detection result.
[score, idx] = max(score);
bbox = bboxes(idx, :);
annotation = sprintf('%s: (Confidence = %f)', label(idx), score);
Imgdetected = insertObjectAnnotation(testimg, 'rectangle', bbox, annotation);
figure
imshow(Imgdetected);
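One thing worth checking (a sketch, assuming gTruth.mat stores a groundTruth object exported from the Image Labeler app rather than a plain table): trainRCNNObjectDetector wants a training table, so a labeler export usually needs converting first:

```matlab
load('gTruth.mat')  % assumed to load a groundTruth object named gTruth

% Convert the labeler output into the table format the trainer expects
trainingData = objectDetectorTrainingData(gTruth);

rcnn = trainRCNNObjectDetector(trainingData, netTransfer, opts, ...
    'NegativeOverlapRange', [0 0.3]);
```

If gTruth is already a table of file names and boxes, this conversion is unnecessary and the original call is fine.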
Matpar
Matpar on 20 Jan 2020
I got the error for the bounding box, @Shashank Gupta; it was evading me! I forgot to clear the workspace, and I had been beating myself up all the while trying to understand why the box was not being drawn!
I think fatigue is taking its toll now. I've been up for a while trying to solve this. I prevailed, but thanks for answering the call for assistance; I really appreciate it loads!
Have a great day...
Matpar
Matpar on 26 Jan 2020 at 15:21
Hi SG,
I just did it over and got the same result. What am I doing wrong? Please teach me. I think by now I should have solved this, and still I am challenged. I need a push please.
This is my code:
clc
clear
net = alexnet
layers = [
imageInputLayer([227 227 3],"Name","data")
convolution2dLayer([11 11],96,"Name","conv1","BiasLearnRateFactor",2,"Stride",[4 4])
reluLayer("Name","relu1")
crossChannelNormalizationLayer(5,"Name","norm1","K",1)
maxPooling2dLayer([3 3],"Name","pool1","Stride",[2 2])
groupedConvolution2dLayer([5 5],128,2,"Name","conv2","BiasLearnRateFactor",2,"Padding",[2 2 2 2])
reluLayer("Name","relu2")
crossChannelNormalizationLayer(5,"Name","norm2","K",1)
maxPooling2dLayer([3 3],"Name","pool2","Stride",[2 2])
convolution2dLayer([3 3],384,"Name","conv3","BiasLearnRateFactor",2,"Padding",[1 1 1 1])
reluLayer("Name","relu3")
groupedConvolution2dLayer([3 3],192,2,"Name","conv4","BiasLearnRateFactor",2,"Padding",[1 1 1 1])
reluLayer("Name","relu4")
groupedConvolution2dLayer([3 3],128,2,"Name","conv5","BiasLearnRateFactor",2,"Padding",[1 1 1 1])
reluLayer("Name","relu5")
maxPooling2dLayer([3 3],"Name","pool5","Stride",[2 2])
fullyConnectedLayer(4096,"Name","fc6","BiasLearnRateFactor",2)
reluLayer("Name","relu6")
dropoutLayer(0.5,"Name","drop6")
fullyConnectedLayer(4096,"Name","fc7","BiasLearnRateFactor",2)
reluLayer("Name","relu7")
dropoutLayer(0.5,"Name","drop7")
fullyConnectedLayer(10,"Name","fc","BiasLearnRateFactor",10,"WeightLearnRateFactor",10)
softmaxLayer("Name","softmax")
classificationLayer("Name","classoutput")]
imfolder = '/Users/mmgp/Desktop/gunsGT';
filenames = dir(fullfile(imfolder,'*.jpg'))
total_images = numel(filenames);
load('gTruth.mat','filenames')
% for n = 1:total_images
% f = fullfile(imfolder,filenames(n).name);
% myims = imread(f);
% figure(n)
% imshow(myims);
% end
imds = imageDatastore(imfolder,'IncludeSubFolders',true,'LabelSource','Foldernames')
[imdsTrain,imdsValidation] = splitEachLabel(imds,0.7,'randomized');
inputSize = net.Layers(1).InputSize
layersTransfer = net.Layers(1:end-3);
numClasses = numel(categories(imdsTrain.Labels));
Tlayers = [
layersTransfer
fullyConnectedLayer(numClasses,'WeightLearnRateFactor',80,'BiasLearnRateFactor',80);
softmaxLayer
classificationLayer];
size(inputSize)
imds.ReadFcn=@(loc)imresize(imread(loc),[227,227])
pixelRange = [-30 30];
imageAugmenter = imageDataAugmenter( ...
'RandXReflection',true, ...
'RandXTranslation',pixelRange, ...
'RandYTranslation',pixelRange);
augimdsTrain = augmentedImageDatastore(inputSize(1:2),imdsTrain, ...
'DataAugmentation',imageAugmenter)
augTrainingSet = augmentedImageDatastore(inputSize ,imdsTrain,'ColorPreprocessing', 'gray2rgb');
augValidation = augmentedImageDatastore(inputSize,imdsValidation,'ColorPreprocessing', 'gray2rgb');
layer = 'fc7';
featuresTrain = activations(net,augTrainingSet,layer,'OutputAs','rows');
featuresTest = activations(net,augValidation,layer,'OutputAs','rows');
YTrain = imdsTrain.Labels;
YTest = imdsValidation.Labels;
mdl = fitcecoc(featuresTrain,YTrain)
YPred = predict(mdl,featuresTest);
opts = trainingOptions('sgdm',...
'Momentum',0.9,...
'InitialLearnRate', 1e-4,...
'LearnRateSchedule', 'piecewise', ...
'LearnRateDropFactor', 0.1, ...
'Shuffle','every-epoch', ...
'LearnRateDropPeriod', 8, ...
'L2Regularization', 1e-4, ...
'MaxEpochs', 10,...
'MiniBatchSize',25,...
'Verbose', true);
trainedNet = trainNetwork(augTrainingSet,Tlayers,opts)
[YPred,probs] = classify(trainedNet,augValidation);
YValidation = imdsValidation.Labels;
Class_accuracy = mean(YPred == YTest)
idx = [1 5 10 15];
figure
for i = 1:numel(idx)
subplot(2,2,i)
I = readimage(imdsTrain,idx(i));
label = YPred(idx(i));
imshow(I)
title('Gun Predictions')
end
% idx = randperm(numel(imdsValidation.Files),16);
% figure
% for i = 1:16
% subplot(4,4,i)
% I = readimage(imdsValidation,idx(i));
% imshow(I)
% label = YPred(idx(i));
% title(string(label) + ", " + num2str(100*max(probs(idx(i),:)),16) + "%");
% end
load('gTruth.mat')
% Positive and Negative Overlap Range Controls Which Image Patch is Used
rcnn = trainRCNNObjectDetector(gTruth, trainedNet, opts, 'NegativeOverlapRange', [0 0.3]);
%% Step 21 Testing the R-CNN detector on a test image.
testimg = imread('Gun00011.jpg');
[bboxes,score,label] = detect(rcnn,testimg,'MiniBatchSize',25)
%% Step 22 Display strongest detection result.
[score, idx] = max(score);
bbox = bboxes(idx, :);
annotation = sprintf('%s: (Confidence = %f)', label(idx), score);
Imgdetected = insertObjectAnnotation(testimg, 'rectangle', bbox, annotation);
figure
imshow(Imgdetected);
Still the box is not showing :(
Please help me; I would very much appreciate it if you could point out my error and teach me how to code the solution please...
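One possibility worth ruling out (a defensive sketch, not a confirmed diagnosis of your setup): detect can return empty outputs when no region scores high enough, and calling max on an empty score vector then leaves nothing to draw:

```matlab
[bboxes, score, label] = detect(rcnn, testimg, 'MiniBatchSize', 25);

if isempty(bboxes)
    % Nothing cleared the detector's internal score threshold
    disp('No detections returned - the detector found no confident regions.')
else
    % Keep only the strongest detection and annotate it
    [score, idx] = max(score);
    annotation = sprintf('%s: (Confidence = %f)', label(idx), score);
    Imgdetected = insertObjectAnnotation(testimg, 'rectangle', ...
        bboxes(idx,:), annotation);
    figure, imshow(Imgdetected)
end
```

If the empty branch fires, the problem is upstream in the training data or detector, not in the drawing code.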

Sign in to comment.

Sign in to answer this question.

