Neural Network hyperparameter tuning

7 views (last 30 days)
Saeed Magsi on 27 Jan 2022
Hello. I have been working on hyperparameter tuning using bayesopt, but I am getting the error "The logical indices in position 2 contain a true value outside of the array bounds." I actually have two outputs. I applied the following code, but it did not work in my case, as it works with one output only. Can anyone help me with this? Regards.
% Make some data
Daten = rand(100, 3);
Daten(:,3) = Daten(:,1) + Daten(:,2) + .1*randn(100, 1); % Minimum asymptotic error is .1
[m,n] = size(Daten) ;
% Split into train and test
P = 0.7 ;
Training = Daten(1:round(P*m),:) ;
Testing = Daten(round(P*m)+1:end,:);
XTrain = Training(:,1:n-1);
YTrain = Training(:,n);
XTest = Testing(:,1:n-1);
YTest = Testing(:,n);
% Define a train/validation split to use inside the objective function
cv = cvpartition(numel(YTrain), 'Holdout', 1/3);
% Define hyperparameters to optimize
vars = [optimizableVariable('hiddenLayerSize', [1,20], 'Type', 'integer');
optimizableVariable('lr', [1e-3 1], 'Transform', 'log')];
% Optimize
minfn = @(T)kfoldLoss(XTrain', YTrain', cv, T.hiddenLayerSize, T.lr);
results = bayesopt(minfn, vars,'IsObjectiveDeterministic', false,...
'AcquisitionFunctionName', 'expected-improvement-plus');
T = bestPoint(results)
% Train final model on full training set using the best hyperparameters
net = feedforwardnet(T.hiddenLayerSize, 'traingd');
net.trainParam.lr = T.lr;
net = train(net, XTrain', YTrain');
% Evaluate on test set and compute final rmse
ypred = net(XTest');
finalrmse = sqrt(mean((ypred - YTest').^2))
function rmse = kfoldLoss(x, y, cv, numHid, lr)
% Train net.
net = feedforwardnet(numHid, 'traingd');
net.trainParam.lr = lr;
net = train(net, x(:,cv.training), y(:,cv.training));
% Evaluate on validation set and compute rmse
ypred = net(x(:, cv.test));
rmse = sqrt(mean((ypred - y(cv.test)).^2));
end
5 Comments
KSSV on 29 Jan 2022
The code runs without any error in my version. Which version are you using?
Saeed Magsi on 29 Jan 2022
Edited: Walter Roberson on 31 Jan 2022
@KSSV thank you for your response. Yes, the single-output code works fine for me as well without any error, but it gives me the above-mentioned error when I apply the code to two outputs.
Please try the code below with two outputs to reproduce the error. Regards.
clc;
clear;
Daten=rand(100,4);
Daten(:,4)=Daten(:,1)+Daten(:,2)+Daten(:,3)+.1*randn(100,1);
[m,n]=size(Daten);
% Split into train and test
p=0.7;
Training=Daten(1:round(p*m),:);
Testing=Daten(round(p*m)+1:end,:);
XTrain=Training(:,1:n-2);
YTrain=Training(:,[3:4]);
XTest=Testing(:,1:n-2);
YTest=Testing(:,[3:4]);
cv=cvpartition(numel(YTrain),"HoldOut",1/3);
% Define hyperparameters to optimize
vars=[optimizableVariable('hiddenLayerSize',[1,20],"Type","integer");
optimizableVariable('lr',[1e-3 1],"Transform","log")];
% Optimize
minfn = @(T)kfoldLoss(XTrain', YTrain', cv, T.hiddenLayerSize, T.lr);
results=bayesopt(minfn,vars,'IsObjectiveDeterministic',false,"AcquisitionFunctionName","expected-improvement-plus");
T=bestPoint(results)
% Train final model on the full training set using the best hyperparameters
net=fitnet(T.hiddenLayerSize,'traingd');
net.trainParam.lr=T.lr;
net=train(net,XTrain',YTrain');
ypred=net(XTest');
finalrmse=sqrt(mean((ypred-YTest').^2))
function rmse=kfoldLoss(x,y,cv,numHid,lr)
net = feedforwardnet(numHid, 'traingd');
net.trainParam.lr = lr;
net = train(net, x(:,cv.training), y(:,cv.training));
ypred = net(x(:, cv.test));
rmse = sqrt(mean((ypred - y(cv.test)).^2));
end


Accepted Answer

KSSV on 29 Jan 2022
You have extended the single-output code straight to two outputs and mixed up the dimensions. You need to check the dimensions of your arrays. This line:
cv=cvpartition(numel(YTrain),"HoldOut",1/3);
Because you used numel, it counts 70*2 = 140 elements instead of 70 samples, and when cv is later used for indexing you get that error. Replace that line with:
cv=cvpartition(length(YTrain),"HoldOut",1/3);
That resolves the reported error. You may still get errors later, so check the dimensions carefully.
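To see why numel is the culprit, here is a minimal sketch (the variable values are illustrative, not from the original post):
% Hypothetical illustration of the numel vs. length difference
YTrain = rand(70, 2);        % 70 samples, 2 target columns
numel(YTrain)                % 140 -- counts every element
length(YTrain)               % 70  -- length of the longest dimension
% With numel, cvpartition builds a 140-element partition. Indexing the
% transposed 2-by-70 target as y(:, cv.test) then requests columns beyond 70,
% which is exactly the "logical indices ... outside of the array bounds" error.
cv = cvpartition(size(YTrain, 1), 'HoldOut', 1/3);   % size(YTrain, 1), i.e. the sample count, is another safe choice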
2 Comments
Saeed Magsi on 31 Jan 2022
Thank you very much. It worked for me. I was indeed messing up the dimensions.
Shubham Baisthakur on 8 Mar 2023
Is it possible to extend this method to optimize the number of fully-connected layers in ANN architecture?
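One possible sketch (not from this thread; it assumes every hidden layer shares the same optimized size) is to add a second integer variable for the layer count and expand it into a vector of hidden layer sizes, which feedforwardnet and fitnet accept directly:
% Hypothetical extension: also optimize the number of hidden layers
vars = [optimizableVariable('hiddenLayerSize', [1,20], 'Type', 'integer');
        optimizableVariable('numLayers',       [1,3],  'Type', 'integer');
        optimizableVariable('lr', [1e-3 1], 'Transform', 'log')];
% repmat turns the scalar layer size into a row vector, e.g. [10 10 10],
% so the existing kfoldLoss function and final training call work unchanged.
minfn = @(T) kfoldLoss(XTrain', YTrain', cv, ...
                       repmat(T.hiddenLayerSize, 1, T.numLayers), T.lr);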


More Answers (0)
