How to use the genetic algorithm to optimize "trainingOptions" for trainNetwork?

Renita Raidoo, 29 June 2020
Hi all, I am trying to use the genetic algorithm optimizer to tune hyperparameters in a neural network made using the Deep Network Designer. When setting up the objective function, I can't seem to make the InitialLearnRate one of the hyperparameters to optimise. As far as I understand, the trainingOptions function requires the learn rate to be a positive integer, but how would I set it up if I want the genetic algorithm to vary that? I have attached the section of code; any help will be greatly appreciated!
function f = SeqFunction(p)
% Hyperparameters for the optimization
numHiddenUnits = round(p(1));
LearnRate = round(p(2));

% Define the network
numFeatures = 7;
numResponses = 1;
layers = [ ...
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits, 'OutputMode', "sequence")
    dropoutLayer(0.2)
    lstmLayer(numHiddenUnits, 'OutputMode', "sequence")
    %dropoutLayer(0.2)
    fullyConnectedLayer(numResponses)
    regressionLayer];

options = trainingOptions('adam', ...
    'MaxEpochs', 100, ...
    'GradientThreshold', 1, ...
    'InitialLearnRate', LearnRate, ... % This is the line I am struggling with
    'Shuffle', 'never', ...
    'LearnRateSchedule', 'piecewise', ...
    'LearnRateDropPeriod', 125, ...
    'LearnRateDropFactor', 0.9, ...
    'Verbose', 0, ...
    'Plots', 'training-progress');

% Train the network
net = trainNetwork(XTrain, YTrain, layers, options);
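One possible setup (a sketch, not a tested answer): trainingOptions actually accepts any positive scalar for 'InitialLearnRate', so the `round(p(2))` call is what forces it to an integer (typically 0 or 1). You could instead let ga treat that variable as continuous, searching in log10 space so small rates are sampled evenly, and apply the integer constraint only to the hidden-unit count. The bounds and population settings below are illustrative assumptions, not values from the original post:

```matlab
% Inside SeqFunction, decode the learn rate without rounding:
%   LearnRate = 10^p(2);   % p(2) is log10 of the learn rate

nvars = 2;
lb = [50, -4];    % assumed bounds: hidden units in [50, 300],
ub = [300, -1];   % log10(learn rate) in [-4, -1], i.e. 1e-4 to 1e-1
IntCon = 1;       % only p(1) (hidden units) is integer-constrained

opts = optimoptions('ga', 'PopulationSize', 20, 'MaxGenerations', 10);
[pBest, fBest] = ga(@SeqFunction, nvars, [], [], [], [], ...
                    lb, ub, [], IntCon, opts);
```

SeqFunction would also need to return a fitness value f (e.g. the validation RMSE of the trained network) for ga to minimise.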

Answers (0)
