Sparse Autoencoder with Adam optimization

6 views (last 30 days)
Amy on 5 January 2021
Hello!
I have the NSL-KDD dataset split into four parts: 1) train attributes (121x125973 double), 2) train labels (1x125973 double), 3) test attributes (121x22544 double), and 4) test labels (1x22544 double), ready for the algorithm.
I applied a sparse autoencoder and it works without any problem:
% minFunc options: L-BFGS (UFLDL-style training)
options.Method = 'lbfgs';
options.maxIter = maxIter;
options.useMex = 0;

% Minimize the sparse autoencoder cost over the parameter vector theta
[opttheta, cost] = minFunc(@(p) sparseAutoencoderCost(p, inputSize, ...
    hs, l1, sp, beta, trainAttr), theta, options);

% Feed-forward pass to extract the learned features
trainFeatures = feedForwardAutoencoder(opttheta, hs, inputSize, trainAttr);
testFeatures = feedForwardAutoencoder(opttheta, hs, inputSize, testAttr);
But when I try to optimize the result using the Adam optimizer, I get this error: "Unrecognized property 'GRADIENTDECAYFACTOR' for class 'nnet.cnn.TrainingOptionsADAM'."
This is my code:
options = trainingOptions('adam', ...
    'InitialLearnRate', 3e-4, ...
    'SquaredGradientDecayFactor', 0.99, ...
    'MaxEpochs', 20, ...
    'MiniBatchSize', 64, ...
    'Plots', 'training-progress');

[opttheta, cost] = minFunc(@(p) sparseAutoencoderCost(p, inputSize, ...
    hs, l1, sp, beta, trainAttr), theta, options);

trainFeatures = feedForwardAutoencoder(opttheta, hs, inputSize, trainAttr);
testFeatures = feedForwardAutoencoder(opttheta, hs, inputSize, testAttr);
How can I apply a sparse autoencoder with Adam optimization? I suspect the problem is that trainingOptions builds an options object meant for trainNetwork, which minFunc does not understand, but I don't know the right way to bring Adam into this setup.
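For concreteness, the closest thing I can picture is dropping minFunc and hand-rolling the Adam update around the same cost function. This is only a minimal sketch, assuming sparseAutoencoderCost returns [cost, grad] as in the UFLDL starter code; the hyperparameter values (alpha, beta1, beta2, epsilon, numIter) are illustrative guesses, not tuned settings:

% Hand-rolled full-batch Adam over the sparse autoencoder cost.
% Sketch only: assumes sparseAutoencoderCost returns [cost, grad];
% hyperparameters below are illustrative, not tuned.
alpha   = 3e-4;           % step size (same as InitialLearnRate above)
beta1   = 0.9;            % decay rate for the first-moment estimate
beta2   = 0.99;           % decay rate for the second-moment estimate
epsilon = 1e-8;           % numerical stabilizer
numIter = 400;            % assumed iteration budget

m = zeros(size(theta));   % running mean of gradients
v = zeros(size(theta));   % running mean of squared gradients

for t = 1:numIter
    [cost, grad] = sparseAutoencoderCost(theta, inputSize, ...
        hs, l1, sp, beta, trainAttr);
    m = beta1 * m + (1 - beta1) * grad;       % update first moment
    v = beta2 * v + (1 - beta2) * grad.^2;    % update second moment
    mhat = m / (1 - beta1^t);                 % bias-corrected estimates
    vhat = v / (1 - beta2^t);
    theta = theta - alpha * mhat ./ (sqrt(vhat) + epsilon);
end
opttheta = theta;

Mini-batching, as in the trainingOptions call above, would mean sampling a random subset of the columns of trainAttr inside the loop instead of passing the full matrix every iteration.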

Answers (0)


Release: R2020a
