Error: Unable to find variable while trying to recognize results using trained network.
3 views (last 30 days)
I am trying to detect pedestrians in a thermal dataset. I was able to train on a custom dataset with DarkNet53, using a script generated by Deep Network Designer. Now, when I try to test the results, I get an error. The code worked fine before; the only change is that I am using a new dataset, which should not be the problem. I have attached my .m file and some sample images. If you could check the code and let me know where I need to modify it, that would be really helpful, as this is part of my project.
Thank you very much in advance.
14 comments
Partha Dey
16 Jan 2023
Edited: Partha Dey
16 Jan 2023
I am unable to attach the files because I have exceeded the maximum number of attachments for the day. I am posting the full code that was working before, and I will attach the images once I am able to. Thank you very much in advance.
% initialization parameters are the parameters of the initial pretrained network.
trainingSetup = load("C:\Users\Neel\Documents\MATLAB\YOLOv3ObjectDetection\DarkNet53\trainednetwork.mat");
%% Import Data
% Import training and validation data.
imdsTrain = imageDatastore("C:\Users\Neel\Documents\MATLAB\YOLOv3ObjectDetection\DarkNet53\Dataset1","IncludeSubfolders",true,"LabelSource","foldernames");
[imdsTrain, imdsValidation] = splitEachLabel(imdsTrain,0.7);
%% Augmentation Settings
imageAugmenter = imageDataAugmenter(...
"RandRotation",[-90 90],...
"RandScale",[1 2],...
"RandXReflection",true);
%Resize the images to match the network input layer.
augimdsTrain = augmentedImageDatastore([256 256 3],imdsTrain,"DataAugmentation",imageAugmenter);
augimdsValidation = augmentedImageDatastore([256 256 3],imdsValidation);
%% Set Training Options
% Specify options to use when training.
opts = trainingOptions("sgdm",...
"ExecutionEnvironment","auto",...
"InitialLearnRate",0.0001,...
"LearnRateDropFactor",0.2,...
"LearnRateDropPeriod",5,...
"MaxEpochs",20,...
"MiniBatchSize",30,...
"Shuffle","every-epoch",...
"ValidationFrequency",13,...
"Plots","training-progress",...
"ValidationData",augimdsValidation);
%% Create Layer Graph
% Create the layer graph variable to contain the network layers.
lgraph = layerGraph();
%% Add Layer Branches
% Add the branches of the network to the layer graph. Each branch is a linear
% array of layers.
tempLayers = [imageInputLayer([256 256 3],"Name","input","Normalization","rescale-zero-one","Max",trainingSetup.input.Max,"Min",trainingSetup.input.Min)
convolution2dLayer([3 3],32,"Name","conv1","Padding","same","Bias",trainingSetup.conv1.Bias,"Weights",trainingSetup.conv1.Weights)
batchNormalizationLayer("Name","batchnorm1","Offset",trainingSetup.batchnorm1.Offset,"Scale",trainingSetup.batchnorm1.Scale,"TrainedMean",trainingSetup.batchnorm1.TrainedMean,"TrainedVariance",trainingSetup.batchnorm1.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu1")
convolution2dLayer([3 3],64,"Name","conv2","Padding",[1 0 1 0],"Stride",[2 2],"Bias",trainingSetup.conv2.Bias,"Weights",trainingSetup.conv2.Weights)
batchNormalizationLayer("Name","batchnorm2","Offset",trainingSetup.batchnorm2.Offset,"Scale",trainingSetup.batchnorm2.Scale,"TrainedMean",trainingSetup.batchnorm2.TrainedMean,"TrainedVariance",trainingSetup.batchnorm2.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu2")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],32,"Name","conv3","Padding","same","Bias",trainingSetup.conv3.Bias,"Weights",trainingSetup.conv3.Weights)
batchNormalizationLayer("Name","batchnorm3","Offset",trainingSetup.batchnorm3.Offset,"Scale",trainingSetup.batchnorm3.Scale,"TrainedMean",trainingSetup.batchnorm3.TrainedMean,"TrainedVariance",trainingSetup.batchnorm3.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu3")
convolution2dLayer([3 3],64,"Name","conv4","Padding","same","Bias",trainingSetup.conv4.Bias,"Weights",trainingSetup.conv4.Weights)
batchNormalizationLayer("Name","batchnorm4","Offset",trainingSetup.batchnorm4.Offset,"Scale",trainingSetup.batchnorm4.Scale,"TrainedMean",trainingSetup.batchnorm4.TrainedMean,"TrainedVariance",trainingSetup.batchnorm4.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu4")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","res1")
convolution2dLayer([3 3],128,"Name","conv5","Padding",[1 0 1 0],"Stride",[2 2],"Bias",trainingSetup.conv5.Bias,"Weights",trainingSetup.conv5.Weights)
batchNormalizationLayer("Name","batchnorm5","Offset",trainingSetup.batchnorm5.Offset,"Scale",trainingSetup.batchnorm5.Scale,"TrainedMean",trainingSetup.batchnorm5.TrainedMean,"TrainedVariance",trainingSetup.batchnorm5.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu5")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],64,"Name","conv6","Padding","same","Bias",trainingSetup.conv6.Bias,"Weights",trainingSetup.conv6.Weights)
batchNormalizationLayer("Name","batchnorm6","Offset",trainingSetup.batchnorm6.Offset,"Scale",trainingSetup.batchnorm6.Scale,"TrainedMean",trainingSetup.batchnorm6.TrainedMean,"TrainedVariance",trainingSetup.batchnorm6.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu6")
convolution2dLayer([3 3],128,"Name","conv7","Padding","same","Bias",trainingSetup.conv7.Bias,"Weights",trainingSetup.conv7.Weights)
batchNormalizationLayer("Name","batchnorm7","Offset",trainingSetup.batchnorm7.Offset,"Scale",trainingSetup.batchnorm7.Scale,"TrainedMean",trainingSetup.batchnorm7.TrainedMean,"TrainedVariance",trainingSetup.batchnorm7.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu7")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = additionLayer(2,"Name","res2");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],64,"Name","conv8","Padding","same","Bias",trainingSetup.conv8.Bias,"Weights",trainingSetup.conv8.Weights)
batchNormalizationLayer("Name","batchnorm8","Offset",trainingSetup.batchnorm8.Offset,"Scale",trainingSetup.batchnorm8.Scale,"TrainedMean",trainingSetup.batchnorm8.TrainedMean,"TrainedVariance",trainingSetup.batchnorm8.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu8")
convolution2dLayer([3 3],128,"Name","conv9","Padding","same","Bias",trainingSetup.conv9.Bias,"Weights",trainingSetup.conv9.Weights)
batchNormalizationLayer("Name","batchnorm9","Offset",trainingSetup.batchnorm9.Offset,"Scale",trainingSetup.batchnorm9.Scale,"TrainedMean",trainingSetup.batchnorm9.TrainedMean,"TrainedVariance",trainingSetup.batchnorm9.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu9")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","res3")
convolution2dLayer([3 3],256,"Name","conv10","Padding",[1 0 1 0],"Stride",[2 2],"Bias",trainingSetup.conv10.Bias,"Weights",trainingSetup.conv10.Weights)
batchNormalizationLayer("Name","batchnorm10","Offset",trainingSetup.batchnorm10.Offset,"Scale",trainingSetup.batchnorm10.Scale,"TrainedMean",trainingSetup.batchnorm10.TrainedMean,"TrainedVariance",trainingSetup.batchnorm10.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu10")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],128,"Name","conv11","Padding","same","Bias",trainingSetup.conv11.Bias,"Weights",trainingSetup.conv11.Weights)
batchNormalizationLayer("Name","batchnorm11","Offset",trainingSetup.batchnorm11.Offset,"Scale",trainingSetup.batchnorm11.Scale,"TrainedMean",trainingSetup.batchnorm11.TrainedMean,"TrainedVariance",trainingSetup.batchnorm11.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu11")
convolution2dLayer([3 3],256,"Name","conv12","Padding","same","Bias",trainingSetup.conv12.Bias,"Weights",trainingSetup.conv12.Weights)
batchNormalizationLayer("Name","batchnorm12","Offset",trainingSetup.batchnorm12.Offset,"Scale",trainingSetup.batchnorm12.Scale,"TrainedMean",trainingSetup.batchnorm12.TrainedMean,"TrainedVariance",trainingSetup.batchnorm12.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu12")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = additionLayer(2,"Name","res4");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],128,"Name","conv13","Padding","same","Bias",trainingSetup.conv13.Bias,"Weights",trainingSetup.conv13.Weights)
batchNormalizationLayer("Name","batchnorm13","Offset",trainingSetup.batchnorm13.Offset,"Scale",trainingSetup.batchnorm13.Scale,"TrainedMean",trainingSetup.batchnorm13.TrainedMean,"TrainedVariance",trainingSetup.batchnorm13.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu13")
convolution2dLayer([3 3],256,"Name","conv14","Padding","same","Bias",trainingSetup.conv14.Bias,"Weights",trainingSetup.conv14.Weights)
batchNormalizationLayer("Name","batchnorm14","Offset",trainingSetup.batchnorm14.Offset,"Scale",trainingSetup.batchnorm14.Scale,"TrainedMean",trainingSetup.batchnorm14.TrainedMean,"TrainedVariance",trainingSetup.batchnorm14.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu14")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = additionLayer(2,"Name","res5");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],128,"Name","conv15","Padding","same","Bias",trainingSetup.conv15.Bias,"Weights",trainingSetup.conv15.Weights)
batchNormalizationLayer("Name","batchnorm15","Offset",trainingSetup.batchnorm15.Offset,"Scale",trainingSetup.batchnorm15.Scale,"TrainedMean",trainingSetup.batchnorm15.TrainedMean,"TrainedVariance",trainingSetup.batchnorm15.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu15")
convolution2dLayer([3 3],256,"Name","conv16","Padding","same","Bias",trainingSetup.conv16.Bias,"Weights",trainingSetup.conv16.Weights)
batchNormalizationLayer("Name","batchnorm16","Offset",trainingSetup.batchnorm16.Offset,"Scale",trainingSetup.batchnorm16.Scale,"TrainedMean",trainingSetup.batchnorm16.TrainedMean,"TrainedVariance",trainingSetup.batchnorm16.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu16")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = additionLayer(2,"Name","res6");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],128,"Name","conv17","Padding","same","Bias",trainingSetup.conv17.Bias,"Weights",trainingSetup.conv17.Weights)
batchNormalizationLayer("Name","batchnorm17","Offset",trainingSetup.batchnorm17.Offset,"Scale",trainingSetup.batchnorm17.Scale,"TrainedMean",trainingSetup.batchnorm17.TrainedMean,"TrainedVariance",trainingSetup.batchnorm17.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu17")
convolution2dLayer([3 3],256,"Name","conv18","Padding","same","Bias",trainingSetup.conv18.Bias,"Weights",trainingSetup.conv18.Weights)
batchNormalizationLayer("Name","batchnorm18","Offset",trainingSetup.batchnorm18.Offset,"Scale",trainingSetup.batchnorm18.Scale,"TrainedMean",trainingSetup.batchnorm18.TrainedMean,"TrainedVariance",trainingSetup.batchnorm18.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu18")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = additionLayer(2,"Name","res7");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],128,"Name","conv19","Padding","same","Bias",trainingSetup.conv19.Bias,"Weights",trainingSetup.conv19.Weights)
batchNormalizationLayer("Name","batchnorm19","Offset",trainingSetup.batchnorm19.Offset,"Scale",trainingSetup.batchnorm19.Scale,"TrainedMean",trainingSetup.batchnorm19.TrainedMean,"TrainedVariance",trainingSetup.batchnorm19.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu19")
convolution2dLayer([3 3],256,"Name","conv20","Padding","same","Bias",trainingSetup.conv20.Bias,"Weights",trainingSetup.conv20.Weights)
batchNormalizationLayer("Name","batchnorm20","Offset",trainingSetup.batchnorm20.Offset,"Scale",trainingSetup.batchnorm20.Scale,"TrainedMean",trainingSetup.batchnorm20.TrainedMean,"TrainedVariance",trainingSetup.batchnorm20.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu20")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = additionLayer(2,"Name","res8");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],128,"Name","conv21","Padding","same","Bias",trainingSetup.conv21.Bias,"Weights",trainingSetup.conv21.Weights)
batchNormalizationLayer("Name","batchnorm21","Offset",trainingSetup.batchnorm21.Offset,"Scale",trainingSetup.batchnorm21.Scale,"TrainedMean",trainingSetup.batchnorm21.TrainedMean,"TrainedVariance",trainingSetup.batchnorm21.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu21")
convolution2dLayer([3 3],256,"Name","conv22","Padding","same","Bias",trainingSetup.conv22.Bias,"Weights",trainingSetup.conv22.Weights)
batchNormalizationLayer("Name","batchnorm22","Offset",trainingSetup.batchnorm22.Offset,"Scale",trainingSetup.batchnorm22.Scale,"TrainedMean",trainingSetup.batchnorm22.TrainedMean,"TrainedVariance",trainingSetup.batchnorm22.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu22")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = additionLayer(2,"Name","res9");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],128,"Name","conv23","Padding","same","Bias",trainingSetup.conv23.Bias,"Weights",trainingSetup.conv23.Weights)
batchNormalizationLayer("Name","batchnorm23","Offset",trainingSetup.batchnorm23.Offset,"Scale",trainingSetup.batchnorm23.Scale,"TrainedMean",trainingSetup.batchnorm23.TrainedMean,"TrainedVariance",trainingSetup.batchnorm23.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu23")
convolution2dLayer([3 3],256,"Name","conv24","Padding","same","Bias",trainingSetup.conv24.Bias,"Weights",trainingSetup.conv24.Weights)
batchNormalizationLayer("Name","batchnorm24","Offset",trainingSetup.batchnorm24.Offset,"Scale",trainingSetup.batchnorm24.Scale,"TrainedMean",trainingSetup.batchnorm24.TrainedMean,"TrainedVariance",trainingSetup.batchnorm24.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu24")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = additionLayer(2,"Name","res10");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],128,"Name","conv25","Padding","same","Bias",trainingSetup.conv25.Bias,"Weights",trainingSetup.conv25.Weights)
batchNormalizationLayer("Name","batchnorm25","Offset",trainingSetup.batchnorm25.Offset,"Scale",trainingSetup.batchnorm25.Scale,"TrainedMean",trainingSetup.batchnorm25.TrainedMean,"TrainedVariance",trainingSetup.batchnorm25.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu25")
convolution2dLayer([3 3],256,"Name","conv26","Padding","same","Bias",trainingSetup.conv26.Bias,"Weights",trainingSetup.conv26.Weights)
batchNormalizationLayer("Name","batchnorm26","Offset",trainingSetup.batchnorm26.Offset,"Scale",trainingSetup.batchnorm26.Scale,"TrainedMean",trainingSetup.batchnorm26.TrainedMean,"TrainedVariance",trainingSetup.batchnorm26.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu26")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","res11")
convolution2dLayer([3 3],512,"Name","conv27","Padding",[1 0 1 0],"Stride",[2 2],"Bias",trainingSetup.conv27.Bias,"Weights",trainingSetup.conv27.Weights)
batchNormalizationLayer("Name","batchnorm27","Offset",trainingSetup.batchnorm27.Offset,"Scale",trainingSetup.batchnorm27.Scale,"TrainedMean",trainingSetup.batchnorm27.TrainedMean,"TrainedVariance",trainingSetup.batchnorm27.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu27")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],256,"Name","conv28","Padding","same","Bias",trainingSetup.conv28.Bias,"Weights",trainingSetup.conv28.Weights)
batchNormalizationLayer("Name","batchnorm28","Offset",trainingSetup.batchnorm28.Offset,"Scale",trainingSetup.batchnorm28.Scale,"TrainedMean",trainingSetup.batchnorm28.TrainedMean,"TrainedVariance",trainingSetup.batchnorm28.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu28")
convolution2dLayer([3 3],512,"Name","conv29","Padding","same","Bias",trainingSetup.conv29.Bias,"Weights",trainingSetup.conv29.Weights)
batchNormalizationLayer("Name","batchnorm29","Offset",trainingSetup.batchnorm29.Offset,"Scale",trainingSetup.batchnorm29.Scale,"TrainedMean",trainingSetup.batchnorm29.TrainedMean,"TrainedVariance",trainingSetup.batchnorm29.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu29")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = additionLayer(2,"Name","res12");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],256,"Name","conv30","Padding","same","Bias",trainingSetup.conv30.Bias,"Weights",trainingSetup.conv30.Weights)
batchNormalizationLayer("Name","batchnorm30","Offset",trainingSetup.batchnorm30.Offset,"Scale",trainingSetup.batchnorm30.Scale,"TrainedMean",trainingSetup.batchnorm30.TrainedMean,"TrainedVariance",trainingSetup.batchnorm30.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu30")
convolution2dLayer([3 3],512,"Name","conv31","Padding","same","Bias",trainingSetup.conv31.Bias,"Weights",trainingSetup.conv31.Weights)
batchNormalizationLayer("Name","batchnorm31","Offset",trainingSetup.batchnorm31.Offset,"Scale",trainingSetup.batchnorm31.Scale,"TrainedMean",trainingSetup.batchnorm31.TrainedMean,"TrainedVariance",trainingSetup.batchnorm31.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu31")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = additionLayer(2,"Name","res13");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],256,"Name","conv32","Padding","same","Bias",trainingSetup.conv32.Bias,"Weights",trainingSetup.conv32.Weights)
batchNormalizationLayer("Name","batchnorm32","Offset",trainingSetup.batchnorm32.Offset,"Scale",trainingSetup.batchnorm32.Scale,"TrainedMean",trainingSetup.batchnorm32.TrainedMean,"TrainedVariance",trainingSetup.batchnorm32.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu32")
convolution2dLayer([3 3],512,"Name","conv33","Padding","same","Bias",trainingSetup.conv33.Bias,"Weights",trainingSetup.conv33.Weights)
batchNormalizationLayer("Name","batchnorm33","Offset",trainingSetup.batchnorm33.Offset,"Scale",trainingSetup.batchnorm33.Scale,"TrainedMean",trainingSetup.batchnorm33.TrainedMean,"TrainedVariance",trainingSetup.batchnorm33.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu33")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = additionLayer(2,"Name","res14");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],256,"Name","conv34","Padding","same","Bias",trainingSetup.conv34.Bias,"Weights",trainingSetup.conv34.Weights)
batchNormalizationLayer("Name","batchnorm34","Offset",trainingSetup.batchnorm34.Offset,"Scale",trainingSetup.batchnorm34.Scale,"TrainedMean",trainingSetup.batchnorm34.TrainedMean,"TrainedVariance",trainingSetup.batchnorm34.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu34")
convolution2dLayer([3 3],512,"Name","conv35","Padding","same","Bias",trainingSetup.conv35.Bias,"Weights",trainingSetup.conv35.Weights)
batchNormalizationLayer("Name","batchnorm35","Offset",trainingSetup.batchnorm35.Offset,"Scale",trainingSetup.batchnorm35.Scale,"TrainedMean",trainingSetup.batchnorm35.TrainedMean,"TrainedVariance",trainingSetup.batchnorm35.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu35")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = additionLayer(2,"Name","res15");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],256,"Name","conv36","Padding","same","Bias",trainingSetup.conv36.Bias,"Weights",trainingSetup.conv36.Weights)
batchNormalizationLayer("Name","batchnorm36","Offset",trainingSetup.batchnorm36.Offset,"Scale",trainingSetup.batchnorm36.Scale,"TrainedMean",trainingSetup.batchnorm36.TrainedMean,"TrainedVariance",trainingSetup.batchnorm36.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu36")
convolution2dLayer([3 3],512,"Name","conv37","Padding","same","Bias",trainingSetup.conv37.Bias,"Weights",trainingSetup.conv37.Weights)
batchNormalizationLayer("Name","batchnorm37","Offset",trainingSetup.batchnorm37.Offset,"Scale",trainingSetup.batchnorm37.Scale,"TrainedMean",trainingSetup.batchnorm37.TrainedMean,"TrainedVariance",trainingSetup.batchnorm37.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu37")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = additionLayer(2,"Name","res16");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],256,"Name","conv38","Padding","same","Bias",trainingSetup.conv38.Bias,"Weights",trainingSetup.conv38.Weights)
batchNormalizationLayer("Name","batchnorm38","Offset",trainingSetup.batchnorm38.Offset,"Scale",trainingSetup.batchnorm38.Scale,"TrainedMean",trainingSetup.batchnorm38.TrainedMean,"TrainedVariance",trainingSetup.batchnorm38.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu38")
convolution2dLayer([3 3],512,"Name","conv39","Padding","same","Bias",trainingSetup.conv39.Bias,"Weights",trainingSetup.conv39.Weights)
batchNormalizationLayer("Name","batchnorm39","Offset",trainingSetup.batchnorm39.Offset,"Scale",trainingSetup.batchnorm39.Scale,"TrainedMean",trainingSetup.batchnorm39.TrainedMean,"TrainedVariance",trainingSetup.batchnorm39.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu39")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = additionLayer(2,"Name","res17");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],256,"Name","conv40","Padding","same","Bias",trainingSetup.conv40.Bias,"Weights",trainingSetup.conv40.Weights)
batchNormalizationLayer("Name","batchnorm40","Offset",trainingSetup.batchnorm40.Offset,"Scale",trainingSetup.batchnorm40.Scale,"TrainedMean",trainingSetup.batchnorm40.TrainedMean,"TrainedVariance",trainingSetup.batchnorm40.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu40")
convolution2dLayer([3 3],512,"Name","conv41","Padding","same","Bias",trainingSetup.conv41.Bias,"Weights",trainingSetup.conv41.Weights)
batchNormalizationLayer("Name","batchnorm41","Offset",trainingSetup.batchnorm41.Offset,"Scale",trainingSetup.batchnorm41.Scale,"TrainedMean",trainingSetup.batchnorm41.TrainedMean,"TrainedVariance",trainingSetup.batchnorm41.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu41")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = additionLayer(2,"Name","res18");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],256,"Name","conv42","Padding","same","Bias",trainingSetup.conv42.Bias,"Weights",trainingSetup.conv42.Weights)
batchNormalizationLayer("Name","batchnorm42","Offset",trainingSetup.batchnorm42.Offset,"Scale",trainingSetup.batchnorm42.Scale,"TrainedMean",trainingSetup.batchnorm42.TrainedMean,"TrainedVariance",trainingSetup.batchnorm42.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu42")
convolution2dLayer([3 3],512,"Name","conv43","Padding","same","Bias",trainingSetup.conv43.Bias,"Weights",trainingSetup.conv43.Weights)
batchNormalizationLayer("Name","batchnorm43","Offset",trainingSetup.batchnorm43.Offset,"Scale",trainingSetup.batchnorm43.Scale,"TrainedMean",trainingSetup.batchnorm43.TrainedMean,"TrainedVariance",trainingSetup.batchnorm43.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu43")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","res19")
convolution2dLayer([3 3],1024,"Name","conv44","Padding",[1 0 1 0],"Stride",[2 2],"Bias",trainingSetup.conv44.Bias,"Weights",trainingSetup.conv44.Weights)
batchNormalizationLayer("Name","batchnorm44","Offset",trainingSetup.batchnorm44.Offset,"Scale",trainingSetup.batchnorm44.Scale,"TrainedMean",trainingSetup.batchnorm44.TrainedMean,"TrainedVariance",trainingSetup.batchnorm44.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu44")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],512,"Name","conv45","Padding","same","Bias",trainingSetup.conv45.Bias,"Weights",trainingSetup.conv45.Weights)
batchNormalizationLayer("Name","batchnorm45","Offset",trainingSetup.batchnorm45.Offset,"Scale",trainingSetup.batchnorm45.Scale,"TrainedMean",trainingSetup.batchnorm45.TrainedMean,"TrainedVariance",trainingSetup.batchnorm45.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu45")
convolution2dLayer([3 3],1024,"Name","conv46","Padding","same","Bias",trainingSetup.conv46.Bias,"Weights",trainingSetup.conv46.Weights)
batchNormalizationLayer("Name","batchnorm46","Offset",trainingSetup.batchnorm46.Offset,"Scale",trainingSetup.batchnorm46.Scale,"TrainedMean",trainingSetup.batchnorm46.TrainedMean,"TrainedVariance",trainingSetup.batchnorm46.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu46")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = additionLayer(2,"Name","res20");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],512,"Name","conv47","Padding","same","Bias",trainingSetup.conv47.Bias,"Weights",trainingSetup.conv47.Weights)
batchNormalizationLayer("Name","batchnorm47","Offset",trainingSetup.batchnorm47.Offset,"Scale",trainingSetup.batchnorm47.Scale,"TrainedMean",trainingSetup.batchnorm47.TrainedMean,"TrainedVariance",trainingSetup.batchnorm47.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu47")
convolution2dLayer([3 3],1024,"Name","conv48","Padding","same","Bias",trainingSetup.conv48.Bias,"Weights",trainingSetup.conv48.Weights)
batchNormalizationLayer("Name","batchnorm48","Offset",trainingSetup.batchnorm48.Offset,"Scale",trainingSetup.batchnorm48.Scale,"TrainedMean",trainingSetup.batchnorm48.TrainedMean,"TrainedVariance",trainingSetup.batchnorm48.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu48")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = additionLayer(2,"Name","res21");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],512,"Name","conv49","Padding","same","Bias",trainingSetup.conv49.Bias,"Weights",trainingSetup.conv49.Weights)
batchNormalizationLayer("Name","batchnorm49","Offset",trainingSetup.batchnorm49.Offset,"Scale",trainingSetup.batchnorm49.Scale,"TrainedMean",trainingSetup.batchnorm49.TrainedMean,"TrainedVariance",trainingSetup.batchnorm49.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu49")
convolution2dLayer([3 3],1024,"Name","conv50","Padding","same","Bias",trainingSetup.conv50.Bias,"Weights",trainingSetup.conv50.Weights)
batchNormalizationLayer("Name","batchnorm50","Offset",trainingSetup.batchnorm50.Offset,"Scale",trainingSetup.batchnorm50.Scale,"TrainedMean",trainingSetup.batchnorm50.TrainedMean,"TrainedVariance",trainingSetup.batchnorm50.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu50")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = additionLayer(2,"Name","res22");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],512,"Name","conv51","Padding","same","Bias",trainingSetup.conv51.Bias,"Weights",trainingSetup.conv51.Weights)
batchNormalizationLayer("Name","batchnorm51","Offset",trainingSetup.batchnorm51.Offset,"Scale",trainingSetup.batchnorm51.Scale,"TrainedMean",trainingSetup.batchnorm51.TrainedMean,"TrainedVariance",trainingSetup.batchnorm51.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu51")
convolution2dLayer([3 3],1024,"Name","conv52","Padding","same","Bias",trainingSetup.conv52.Bias,"Weights",trainingSetup.conv52.Weights)
batchNormalizationLayer("Name","batchnorm52","Offset",trainingSetup.batchnorm52.Offset,"Scale",trainingSetup.batchnorm52.Scale,"TrainedMean",trainingSetup.batchnorm52.TrainedMean,"TrainedVariance",trainingSetup.batchnorm52.TrainedVariance)
leakyReluLayer(0.1,"Name","leakyrelu52")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","res23")
globalAveragePooling2dLayer("Name","avg1")
convolution2dLayer([1 1],4,"Name","conv53","BiasLearnRateFactor",10,"Padding","same","WeightLearnRateFactor",10)
softmaxLayer("Name","softmax")
classificationLayer("Name","classoutput")];
lgraph = addLayers(lgraph,tempLayers);
% clean up helper variable
clear tempLayers;
%% Connect Layer Branches
% Connect all the branches of the network to create the network graph.
lgraph = connectLayers(lgraph,"leakyrelu2","conv3");
lgraph = connectLayers(lgraph,"leakyrelu2","res1/in2");
lgraph = connectLayers(lgraph,"leakyrelu4","res1/in1");
lgraph = connectLayers(lgraph,"leakyrelu5","conv6");
lgraph = connectLayers(lgraph,"leakyrelu5","res2/in2");
lgraph = connectLayers(lgraph,"leakyrelu7","res2/in1");
lgraph = connectLayers(lgraph,"res2","conv8");
lgraph = connectLayers(lgraph,"res2","res3/in2");
lgraph = connectLayers(lgraph,"leakyrelu9","res3/in1");
lgraph = connectLayers(lgraph,"leakyrelu10","conv11");
lgraph = connectLayers(lgraph,"leakyrelu10","res4/in2");
lgraph = connectLayers(lgraph,"leakyrelu12","res4/in1");
lgraph = connectLayers(lgraph,"res4","conv13");
lgraph = connectLayers(lgraph,"res4","res5/in2");
lgraph = connectLayers(lgraph,"leakyrelu14","res5/in1");
lgraph = connectLayers(lgraph,"res5","conv15");
lgraph = connectLayers(lgraph,"res5","res6/in2");
lgraph = connectLayers(lgraph,"leakyrelu16","res6/in1");
lgraph = connectLayers(lgraph,"res6","conv17");
lgraph = connectLayers(lgraph,"res6","res7/in2");
lgraph = connectLayers(lgraph,"leakyrelu18","res7/in1");
lgraph = connectLayers(lgraph,"res7","conv19");
lgraph = connectLayers(lgraph,"res7","res8/in2");
lgraph = connectLayers(lgraph,"leakyrelu20","res8/in1");
lgraph = connectLayers(lgraph,"res8","conv21");
lgraph = connectLayers(lgraph,"res8","res9/in2");
lgraph = connectLayers(lgraph,"leakyrelu22","res9/in1");
lgraph = connectLayers(lgraph,"res9","conv23");
lgraph = connectLayers(lgraph,"res9","res10/in2");
lgraph = connectLayers(lgraph,"leakyrelu24","res10/in1");
lgraph = connectLayers(lgraph,"res10","conv25");
lgraph = connectLayers(lgraph,"res10","res11/in2");
lgraph = connectLayers(lgraph,"leakyrelu26","res11/in1");
lgraph = connectLayers(lgraph,"leakyrelu27","conv28");
lgraph = connectLayers(lgraph,"leakyrelu27","res12/in2");
lgraph = connectLayers(lgraph,"leakyrelu29","res12/in1");
lgraph = connectLayers(lgraph,"res12","conv30");
lgraph = connectLayers(lgraph,"res12","res13/in2");
lgraph = connectLayers(lgraph,"leakyrelu31","res13/in1");
lgraph = connectLayers(lgraph,"res13","conv32");
lgraph = connectLayers(lgraph,"res13","res14/in2");
lgraph = connectLayers(lgraph,"leakyrelu33","res14/in1");
lgraph = connectLayers(lgraph,"res14","conv34");
lgraph = connectLayers(lgraph,"res14","res15/in2");
lgraph = connectLayers(lgraph,"leakyrelu35","res15/in1");
lgraph = connectLayers(lgraph,"res15","conv36");
lgraph = connectLayers(lgraph,"res15","res16/in2");
lgraph = connectLayers(lgraph,"leakyrelu37","res16/in1");
lgraph = connectLayers(lgraph,"res16","conv38");
lgraph = connectLayers(lgraph,"res16","res17/in2");
lgraph = connectLayers(lgraph,"leakyrelu39","res17/in1");
lgraph = connectLayers(lgraph,"res17","conv40");
lgraph = connectLayers(lgraph,"res17","res18/in2");
lgraph = connectLayers(lgraph,"leakyrelu41","res18/in1");
lgraph = connectLayers(lgraph,"res18","conv42");
lgraph = connectLayers(lgraph,"res18","res19/in2");
lgraph = connectLayers(lgraph,"leakyrelu43","res19/in1");
lgraph = connectLayers(lgraph,"leakyrelu44","conv45");
lgraph = connectLayers(lgraph,"leakyrelu44","res20/in2");
lgraph = connectLayers(lgraph,"leakyrelu46","res20/in1");
lgraph = connectLayers(lgraph,"res20","conv47");
lgraph = connectLayers(lgraph,"res20","res21/in2");
lgraph = connectLayers(lgraph,"leakyrelu48","res21/in1");
lgraph = connectLayers(lgraph,"res21","conv49");
lgraph = connectLayers(lgraph,"res21","res22/in2");
lgraph = connectLayers(lgraph,"leakyrelu50","res22/in1");
lgraph = connectLayers(lgraph,"res22","conv51");
lgraph = connectLayers(lgraph,"res22","res23/in2");
lgraph = connectLayers(lgraph,"leakyrelu52","res23/in1");
%% Train Network
% Train the network using the specified options and training data.
[net, traininfo] = trainNetwork(augimdsTrain,lgraph,opts);
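% A hedged suggestion, not part of the original post: saving the trained
% network right after training lets the test section below run later in a
% fresh workspace, avoiding "Unrecognized function or variable 'net'".
% The filename "trainedDarknet53.mat" is an assumed name for illustration.
save("trainedDarknet53.mat","net","traininfo");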
%% Test Trained Network
%Select a new image to classify using the trained network.
I = imread('NEWFLIRPERSON1.jpg');
%Resize the test image to match the network input size.
%scale = [256 256];
I = imresize(I, [256 256]);
[YPred,probs] = classify(net,I);
gray = rgb2gray(I);
%imshow(gray)
img = imbinarize(gray,0.5);
%imshow(img)
img = bwareaopen(img,100);
%img = ~img_bin;
[bwLabel, num] = bwlabel(img);
bboxes = regionprops(bwLabel,'BoundingBox');
imshow(I)
label = YPred;
title(string(label) + ", " + num2str(100*max(probs),3) + "%");
hold on
for k = 1 : length(bboxes)
CurrBB = bboxes(k).BoundingBox;
rectangle('Position',[CurrBB(1),CurrBB(2),CurrBB(3),CurrBB(4)],'EdgeColor','y','LineWidth',2);
end
%rectangle('Position',bboxes.BoundingBox,'Edgecolor','y','Linewidth',2);
hold off
Partha Dey
17 Jan 2023
Unrecognized function or variable 'net'
Error in TrainedNetwork (line 454)
[YPred,probs] = classify(net,I);
Walter Roberson
17 Jan 2023
trainingSetup = load("C:\Users\Neel\Documents\MATLAB\YOLOv3ObjectDetection\DarkNet53\trainednetwork.mat");
%% Import Data
% Import training and validation data.
imdsTrain = imageDatastore("C:\Users\Neel\Documents\MATLAB\YOLOv3ObjectDetection\DarkNet53\Dataset1","IncludeSubfolders",true,"LabelSource","foldernames");
We do not have that .mat file or those training directories.
Partha Dey
18 Jan 2023
Hello Sir.
This is the .mat file that I trained with Deep Network Designer using DarkNet53; after training I exported it for detection with this script.
Deep Network Designer generates a live script after training. I took that code and reused it for my dataset. It worked before, but now I am getting this function error when trying to test images for detection.
I may be missing some parameter in the function call. Kindly check and assist me with this.
Thank you.
Walter Roberson
18 Jan 2023
I ran your code on my computer; it reported that it could not find the .mat file.
Without those files I cannot test the code and debug the problem.
I can make guesses about what is wrong, but they might be completely on the wrong track.
Partha Dey
18 Jan 2023
Alright Sir.
I will compress the .mat file and create a web link for it. I will post it here once it is done.
Thank you.
Partha Dey
18 Jan 2023
Dear Sir.
This is the link for the .mat file.
Thank you very much in advance.
Partha Dey
28 Jan 2023
Hello Sir, @Walter Roberson
Please find this link; you can change the folder name.
Thank you very much in advance.
Walter Roberson
29 Jan 2023
trainingSetup = load("C:\Users\Neel\Documents\MATLAB\YOLOv3ObjectDetection\DarkNet53\trainednetwork.mat");
The result of load() of a .mat file is a struct with one field for each variable loaded from the file.
That particular file contains only one variable, trainedNetwork_2.
tempLayers = [imageInputLayer([256 256 3],"Name","input","Normalization","rescale-zero-one","Max",trainingSetup.input.Max,"Min",trainingSetup.input.Min)
convolution2dLayer([3 3],32,"Name","conv1","Padding","same","Bias",trainingSetup.conv1.Bias,"Weights",trainingSetup.conv1.Weights)
batchNormalizationLayer("Name","batchnorm1","Offset",trainingSetup.batchnorm1.Offset,"Scale",trainingSetup.batchnorm1.Scale,"TrainedMean",trainingSetup.batchnorm1.TrainedMean,"TrainedVariance",trainingSetup.batchnorm1.TrainedVariance)
The .mat file does not contain any variable named input, conv1, or batchnorm1.
The DAGNetwork object trainedNetwork_2 does not contain fields or properties with those names either.
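Building on that diagnosis, a minimal sketch of the two fixes implied here: inspect what the loaded struct actually contains, and pull the DAGNetwork out of the struct instead of referencing an undefined variable `net`. The file path and image name are taken from the thread's own code; whether the variable really is `trainedNetwork_2` should be confirmed with `fieldnames` first.

```matlab
% A minimal sketch, assuming the .mat file holds only the DAGNetwork
% variable trainedNetwork_2, as described above.
trainingSetup = load("C:\Users\Neel\Documents\MATLAB\YOLOv3ObjectDetection\DarkNet53\trainednetwork.mat");
disp(fieldnames(trainingSetup))   % shows which variables were actually saved

% Extract the trained network from the struct; 'net' is otherwise undefined
% when the training section of the script has not been run.
net = trainingSetup.trainedNetwork_2;

% Classify a test image with the extracted network.
I = imresize(imread('NEWFLIRPERSON1.jpg'), [256 256]);
[YPred, probs] = classify(net, I);
```

Note that this only fixes the classification step. The layer-construction code still references fields such as trainingSetup.input.Max and trainingSetup.conv1.Weights, which this file does not provide; those come from a separate initialization-parameters export in Deep Network Designer, so that section would need to be skipped or re-exported.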
Partha Dey
30 Jan 2023
Alright Sir. After generating the live script with Deep Network Designer, it was able to run and detect.
Can you suggest how to fix this code, please? Thank you very much in advance.
Answers (0)