neural network hyperparameter tuning

46 views (last 30 days)
Dimitri on 6 Nov 2018
Hello,
since there is no hyperparameter tuning function for neural networks, I wanted to try the bayesopt function. I tried to recreate the example here: https://de.mathworks.com/help/stats/bayesian-optimization-case-study.html, but it does not work. Is there a way to tune the number of hidden neurons? My code does not work...
[m,n] = size(Daten) ;
P = 0.7 ;
Training = Daten(1:round(P*m),:) ;
Testing = Daten(round(P*m)+1:end,:);
XTrain=Training(:,1:n-1);
YTrain=Training(:,n);
XTest=Testing(:,1:n-1);
YTest=Testing(:,n);
c = cvpartition(YTrain,'KFold',10);
hiddenLayerSize=optimizableVariable('hiddenLayerSize',[0,20]);
minfn = @(z)kfoldLoss(fitnet(XTrain,YTrain,'CVPartition',c,...
    'hiddenLayerSize',z.hiddenLayerSize));
results = bayesopt(minfn,hiddenLayerSize,'IsObjectiveDeterministic',true,...
    'AcquisitionFunctionName','expected-improvement-plus');

Accepted Answer

Don Mathis on 17 Nov 2018
If you want a more complete workflow that also optimizes the learning rate, and tests the final model on your test set, you could try this:
% Make some data
Daten = rand(100, 3);
Daten(:,3) = Daten(:,1) + Daten(:,2) + .1*randn(100, 1); % Minimum asymptotic error is .1
[m,n] = size(Daten) ;
% Split into train and test
P = 0.7 ;
Training = Daten(1:round(P*m),:) ;
Testing = Daten(round(P*m)+1:end,:);
XTrain = Training(:,1:n-1);
YTrain = Training(:,n);
XTest = Testing(:,1:n-1);
YTest = Testing(:,n);
% Define a train/validation split to use inside the objective function
cv = cvpartition(numel(YTrain), 'Holdout', 1/3);
% Define hyperparameters to optimize
vars = [optimizableVariable('hiddenLayerSize', [1,20], 'Type', 'integer');
        optimizableVariable('lr', [1e-3 1], 'Transform', 'log')];
% Optimize. Note: kfoldLoss here is the local function defined below,
% not the Statistics Toolbox method of the same name.
minfn = @(T)kfoldLoss(XTrain', YTrain', cv, T.hiddenLayerSize, T.lr);
results = bayesopt(minfn, vars, 'IsObjectiveDeterministic', false,...
    'AcquisitionFunctionName', 'expected-improvement-plus');
T = bestPoint(results)
% Train final model on full training set using the best hyperparameters
net = feedforwardnet(T.hiddenLayerSize, 'traingd');
net.trainParam.lr = T.lr;
net = train(net, XTrain', YTrain');
% Evaluate on test set and compute final rmse
ypred = net(XTest');
finalrmse = sqrt(mean((ypred - YTest').^2))
function rmse = kfoldLoss(x, y, cv, numHid, lr)
% Train net.
net = feedforwardnet(numHid, 'traingd');
net.trainParam.lr = lr;
net = train(net, x(:,cv.training), y(:,cv.training));
% Evaluate on validation set and compute rmse
ypred = net(x(:, cv.test));
rmse = sqrt(mean((ypred - y(cv.test)).^2));
end
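Since the objective trains a network from scratch on every evaluation, it can help to cap the search and seed the RNG for repeatability. A minimal sketch using standard bayesopt and bestPoint options (the option values are illustrative, not part of the answer above):
rng(1)  % train() is stochastic; seed for repeatability
results = bayesopt(minfn, vars, ...
    'IsObjectiveDeterministic', false, ...
    'AcquisitionFunctionName', 'expected-improvement-plus', ...
    'MaxObjectiveEvaluations', 60);  % default is 30
T = bestPoint(results, 'Criterion', 'min-observed')  % best observed point instead of the model-based default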
  6 comments
Saeed Magsi on 27 Jan 2022
Thank you @Don Mathis for your solution. I tried it on my data, but it gives me the error "The logical indices in position 2 contain a true value outside of the array bounds". I actually have two outputs, and this solution only works with a single output, not two. Can you help me solve the problem? Thanks.
SAIF MEHDI on 10 Aug 2022
Most of these solvers handle a single objective function. For your problem you need a multi-objective solver; I know of two: the multi-objective GA (gamultiobj) and Pareto front search (paretosearch). You would have to go through their help documents to understand the syntax; a sketch of the gamultiobj route follows.
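For illustration, a minimal sketch of that gamultiobj route, assuming two target columns, synthetic data, and a hypothetical twoOutputLoss helper; the hidden layer size is rounded inside the objective to keep it integer-valued, and the option values are only there to keep the sketch cheap:
rng(0)  % training is stochastic; seed for repeatability
Daten = rand(100, 4);  % 2 inputs, 2 outputs (made-up data)
Daten(:,3) = Daten(:,1) + .1*randn(100,1);
Daten(:,4) = Daten(:,2) + .1*randn(100,1);
x = Daten(:,1:2)';  y = Daten(:,3:4)';  % row-major, as the shallow nets expect
cv = cvpartition(size(y,2), 'Holdout', 1/3);
% gamultiobj minimizes a vector-valued objective: one RMSE per output
fun = @(z) twoOutputLoss(x, y, cv, round(z(1)), z(2));
lb = [1 1e-3];  ub = [20 1];  % hiddenLayerSize, learning rate
opts = optimoptions('gamultiobj', 'MaxGenerations', 10);
[zpareto, fpareto] = gamultiobj(fun, 2, [], [], [], [], lb, ub, [], opts);
function rmses = twoOutputLoss(x, y, cv, numHid, lr)
% Train on the training fold, return one holdout RMSE per output row.
net = feedforwardnet(numHid, 'traingd');
net.trainParam.lr = lr;
net.trainParam.showWindow = false;  % suppress the training GUI
net = train(net, x(:,cv.training), y(:,cv.training));
ypred = net(x(:,cv.test));
rmses = sqrt(mean((ypred - y(:,cv.test)).^2, 2))';
end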


More Answers (2)

Sean de Wolski on 6 Nov 2018
Edited: Sean de Wolski on 6 Nov 2018
This is nowhere near as easy as it should be. First, the shallow neural net infrastructure is old and uses row-major variables; this needs to be accounted for, and you'll see it below in a ton of .' transposes. Second, you'll need a wrapper around fitnet because it doesn't take all of its options as name-value pairs the way the modern fit* functions in the Statistics Toolbox do. Third, the training is non-deterministic unless you seed the rng yourself.
I don't understand the math behind using k-fold cross validation with a neural net, so I'll use holdout below, which will reliably train and evaluate the network on an independent test set.
Daten = rand(100, 3);
[m,n] = size(Daten) ;
P = 0.7 ;
Training = Daten(1:round(P*m),:) ;
Testing = Daten(round(P*m)+1:end,:);
XTrain=Training(:,1:n-1).'; % Note transposes
YTrain=Training(:,n).';
XTest=Testing(:,1:n-1).';
YTest=Testing(:,n).';
c = cvpartition(numel(YTrain),'Holdout', 0.25);
hiddenLayerSize=optimizableVariable('hiddenLayerSize',[1,20], 'Type', 'integer');
minfn = @(z)wrapFitNet(XTrain, YTrain, 'CVPartition', c, ...
    'hiddenLayerSize', z.hiddenLayerSize);
results = bayesopt(minfn, hiddenLayerSize, 'IsObjectiveDeterministic', false,...
    'AcquisitionFunctionName', 'expected-improvement-plus');
Wrapper function
function cvrmse = wrapFitNet(x, y, varargin)
% Handle variable inputs
ip = inputParser;
ip.addParameter('hiddenLayerSize', 20);
ip.addParameter('CVPartition', cvpartition(numel(y),'Holdout', 0.10));
parse(ip, varargin{:});
cv = ip.Results.CVPartition;
hiddensz = ip.Results.hiddenLayerSize;
% Train net. You would adjust other hyper parameters here.
net = fitnet(hiddensz);
nets = train(net, x(:, cv.training.'), y(:, cv.training.'));
% Evaluate on test set and compute rmse
ypred = nets(x(:, cv.test.'));
cvrmse = sqrt(sum((ypred - y(cv.test.')).^2)/numel(y(cv.test)));
end
Finally, if the only thing you want to optimize is hidden layer size, it may be easiest to just run a loop from 1:20 and try them all; a sketch of that loop follows. Bayesian optimization really helps when you have many different parameters (trainFcn, etc.).
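For reference, a minimal sketch of that brute-force loop, reusing the wrapFitNet wrapper and the holdout partition c from above (training is stochastic, so averaging a few runs per size would give a more stable picture):
rmse = zeros(1, 20);
for hiddensz = 1:20
    rmse(hiddensz) = wrapFitNet(XTrain, YTrain, ...
        'hiddenLayerSize', hiddensz, 'CVPartition', c);
end
[bestRmse, bestSize] = min(rmse)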
  4 comments
Raghu on 29 Jun 2021
Shubham Baisthakur on 8 Mar 2023
Is it possible to extend this method to optimize the number of fully-connected layers as well?
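A minimal sketch of one way to do that (an illustration, not from this thread): treat the depth as another optimizableVariable and pass a vector of layer sizes to feedforwardnet. It reuses XTrain, YTrain, and c from this answer; layerLoss is a hypothetical helper:
vars = [optimizableVariable('hiddenLayerSize', [1 20], 'Type', 'integer');
        optimizableVariable('numLayers', [1 3], 'Type', 'integer')];
minfn = @(z) layerLoss(XTrain, YTrain, c, z);
results = bayesopt(minfn, vars);
function rmse = layerLoss(x, y, cv, z)
sizes = repmat(z.hiddenLayerSize, 1, z.numLayers);  % e.g. [10 10 10]
net = feedforwardnet(sizes);  % a vector of sizes gives multiple hidden layers
net.trainParam.showWindow = false;
net = train(net, x(:,cv.training.'), y(:,cv.training.'));
ypred = net(x(:,cv.test.'));
rmse = sqrt(mean((ypred - y(cv.test.')).^2));
end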



Dimitri on 10 Nov 2018
I'm sorry to bother you again, but I'm having trouble with your code. When the code runs through, I get the following output:
Additionally, it doesn't plot any curves during the Bayesian optimization, which probably has to do with the error. I didn't change anything in your code. Can you help me again, please?
Dimitri
  6 comments
Ali on 7 Mar 2020
Madushan Rathnayaka on 22 Feb 2022
How do we extend this to other parameters?
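One possibility (a sketch under assumptions, not from the thread): add further optimizableVariables and unpack them in the objective, e.g. the training function as a categorical choice. This reuses XTrain, YTrain, and cv from the accepted answer; myLoss is a hypothetical helper, and the listed trainers are gradient-descent variants that all use trainParam.lr:
vars = [optimizableVariable('hiddenLayerSize', [1 20], 'Type', 'integer');
        optimizableVariable('lr', [1e-3 1], 'Transform', 'log');
        optimizableVariable('trainFcn', {'traingd','traingdm','traingdx'}, 'Type', 'categorical')];
minfn = @(z) myLoss(XTrain', YTrain', cv, z);
results = bayesopt(minfn, vars);
function rmse = myLoss(x, y, cv, z)
net = feedforwardnet(z.hiddenLayerSize, char(z.trainFcn));  % categorical -> char
net.trainParam.lr = z.lr;  % valid for all three trainers listed above
net.trainParam.showWindow = false;
net = train(net, x(:,cv.training), y(:,cv.training));
ypred = net(x(:,cv.test));
rmse = sqrt(mean((ypred - y(:,cv.test)).^2));
end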

