I am sure there is something totally wrong, but I just don't know where to get started. I am new to this.
68-point facial landmark detection based on a CNN: how to reduce validation RMSE?
13 views (last 30 days)
I am currently a fourth-year university student, and my teacher asked me to build a facial landmark detection algorithm in MATLAB based on a CNN. I built a simple CNN for facial landmark regression, but the result confuses me: the validation loss is always very large and I don't know how to bring it down.
I have seen the MATLAB tutorial on regressing the rotation angle of MNIST digits, where the RMSE is very low (0.1-0.01), but my RMSE is about 1-2.
% Get human face images and the 68 landmark points for each face
imageFolder = 'E:\graduation project\image pre-processing\outdoor face\';
images = dir(fullfile(imageFolder,'*.png'));
pointsFolder = 'E:\graduation project\image pre-processing\outdoor mouth\';
ppoints = dir(fullfile(pointsFolder,'*.txt'));
% Image size is 48
s = 48;
imageSet = zeros(s,s,3,2*numel(images)); % 2*numel(images) because the images are also flipped to augment the training data
labelSet = zeros(2*numel(images),136);
for k = 1:numel(images)
    IF = fullfile(imageFolder,images(k).name);
    PF = fullfile(pointsFolder,ppoints(k).name);
    I = imread(IF);
    if size(I,3) == 1 % if the image is grayscale, convert it to RGB
        I = cat(3,I,I,I);
    end
    P = dlmread(PF)';
    imageSet(:,:,:,k) = double(I);
    labelSet(k,:) = P;
end
imageSet(:,:,:,k+1:2*numel(images)) = fliplr(imageSet(:,:,:,1:k)); % horizontally flip the images
% Mirror the landmark x-coordinates (columns 1-68 are x, 69-136 are y).
% For 1-based pixel coordinates the mirrored x is s+1-x, not s-x.
% Note also that after a horizontal flip the left/right landmark identities
% swap (e.g. the jawline points reverse order); that re-indexing is omitted here.
labelSet(k+1:2*numel(images),:) = [s+1-labelSet(1:k,1:68), labelSet(1:k,69:136)];
% Normalize the input images to roughly [-1,1] and standardize the labels
ImageSet = (imageSet - 127.5) * 0.0078125; % 0.0078125 = 1/128
LabelSet = (labelSet - mean2(labelSet)) / std2(labelSet);
XTrain = ImageSet(:,:,:,1:400);
YTrain = LabelSet(1:400,:);
XValidation = ImageSet(:,:,:,401:2*numel(images));
YValidation = LabelSet(401:2*numel(images),:);
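```matlab
% (Editor's sketch, an assumption rather than code from the original post.)
% Landmark regressors often normalize each coordinate by the image size
% instead of z-scoring all labels with one global mean/std; the target range
% is then known and the RMSE is directly interpretable as a fraction of the
% image width:
%
%   LabelSet  = labelSet / s;   % coordinates scaled into (0,1]
%   pixelRMSE = rmse * s;       % convert a reported RMSE back to pixels
```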
% The input is a 48-by-48-by-3 image; the output is 68 (x,y) landmark coordinates, i.e. 68*2 = 136 values
layers = [
    imageInputLayer([s s 3])
    convolution2dLayer(3,32)
    batchNormalizationLayer
    preluLayer(32,'prelu1') % preluLayer is a user-defined custom layer, not a built-in
    maxPooling2dLayer(3,'Stride',2)
    convolution2dLayer(3,64)
    batchNormalizationLayer
    preluLayer(64,'prelu2')
    maxPooling2dLayer(3,'Stride',2)
    convolution2dLayer(3,64)
    batchNormalizationLayer
    preluLayer(64,'prelu3')
    maxPooling2dLayer(2,'Stride',2)
    convolution2dLayer(3,128)
    preluLayer(128,'prelu4')
    fullyConnectedLayer(256)
    preluLayer(256,'prelu5')
    fullyConnectedLayer(136)
    regressionLayer];
miniBatchSize = 20;
validationFrequency = 200; % e.g. floor(size(YTrain,1)/miniBatchSize) to validate once per epoch
options = trainingOptions('sgdm', ...
'MiniBatchSize',miniBatchSize, ...
'MaxEpochs',1000, ...
'InitialLearnRate',1e-4, ...
'LearnRateSchedule','piecewise', ...
'LearnRateDropPeriod',250, ...
'Shuffle','every-epoch', ...
'ValidationData',{XValidation,YValidation}, ...
'ValidationFrequency',validationFrequency, ...
'Plots','training-progress', ...
'Verbose',true);
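The script as posted stops at the training options. A hedged sketch of the missing training call and a manual RMSE check (assuming the data-preparation code above has already run) would be:

```matlab
% Train the CNN with the layers and options defined above.
net = trainNetwork(XTrain, YTrain, layers, options);

% Predict on the validation set and compute the RMSE by hand, to compare
% against the value shown on the training-progress plot.
YPred = predict(net, XValidation);
valRMSE = sqrt(mean((YPred - YValidation).^2, 'all'));
fprintf('Validation RMSE: %.4f\n', valRMSE);
```

One caveat on comparing numbers: because the labels were standardized with a global mean and std, an RMSE near 1 means the predictions are off by roughly one label standard deviation, so it is not directly comparable to the 0.1 RMSE of the MNIST rotation tutorial, whose targets are in degrees.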
Answers (2)
magheshbabu govindaraj
13 Mar 2019
Does preluLayer(32,'prelu1') actually exist as a layer in the Neural Network Toolbox?
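preluLayer is indeed not a built-in; MATLAB's custom-layer documentation walks through defining a PReLU layer. A minimal sketch along those lines (the constant 0.25 initialization of Alpha is an illustrative assumption) might look like:

```matlab
classdef preluLayer < nnet.layer.Layer
    % Custom PReLU layer: f(x) = max(x,0) + alpha .* min(x,0),
    % with one learnable negative slope per channel.
    properties (Learnable)
        Alpha % learnable slope for negative inputs, one per channel
    end
    methods
        function layer = preluLayer(numChannels, name)
            layer.Name = name;
            layer.Description = "PReLU with " + numChannels + " channels";
            layer.Alpha = 0.25 * ones(1, 1, numChannels); % initial slope
        end
        function Z = predict(layer, X)
            Z = max(X, 0) + layer.Alpha .* min(X, 0);
        end
    end
end
```

Saved as preluLayer.m on the MATLAB path, this makes the preluLayer(...) calls in the question's layer array resolve.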
Aasim Khurshid
8 Jan 2021
Hi Xuanyi,
It seems that your dataset is too small for such a large label vector. You may try the following options:
- Transfer learning: reuse the low-level features of a pretrained network and fine-tune on your dataset.
- Increase the dataset size. If you do, I would also recommend increasing the depth of your network, because it looks too small to learn such detailed labels. You could also try inception layers to increase the network's capacity.
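The transfer-learning option can be sketched as follows. This is an assumption-laden illustration, not the answerer's code: it picks resnet18 (requires the ResNet-18 support package), relies on that model's documented layer names ('pool5', 'fc1000', 'prob', 'ClassificationLayer_predictions'), and resizes the 48-by-48 inputs to the network's 224-by-224 input size:

```matlab
% Start from a pretrained ResNet-18 and swap the classification head
% for a 136-output regression head.
net = resnet18;
lgraph = layerGraph(net);
lgraph = removeLayers(lgraph, {'fc1000','prob','ClassificationLayer_predictions'});
newLayers = [
    fullyConnectedLayer(136, 'Name', 'fc_landmarks')
    regressionLayer('Name', 'reg_landmarks')];
lgraph = addLayers(lgraph, newLayers);
lgraph = connectLayers(lgraph, 'pool5', 'fc_landmarks');

% Resize the 48x48 training images to ResNet-18's expected input size,
% then fine-tune with a small learning rate, e.g.:
%   XTrainBig = imresize(XTrain, [224 224]);
%   net = trainNetwork(XTrainBig, YTrain, lgraph, options);
```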
Good luck.