"Relu" activation function and "Adam" optimizer for Time delay neural network

Abdelwahab Afifi on 14 Jun 2020
Commented: Abdelwahab Afifi on 14 Jun 2020
I want to design a time delay neural network, but I can't find the leaky rectified linear unit (leaky ReLU) activation function or the Adam optimization algorithm for this type of network (time delay neural network).
2 Comments
Image Analyst on 14 Jun 2020
Not all deep learning modules are shipped with the toolbox. Did you try to search the "Add-on Explorer"?
Abdelwahab Afifi on 14 Jun 2020
I have already added the Deep Learning Toolbox, but I don't know how to integrate the leaky ReLU activation function and the Adam optimizer into the time delay neural network structure.
The time delay neural network is defined as follows:
net = timedelaynet(inputDelays,hiddenSizes,trainFcn)
where trainFcn is limited to the following choices:
'trainlm' : Levenberg-Marquardt
'trainbr' : Bayesian Regularization
'trainbfg' : BFGS Quasi-Newton
'trainrp' : Resilient Backpropagation
'trainscg' : Scaled Conjugate Gradient
'traincgb' : Conjugate Gradient with Powell/Beale Restarts
'traincgf' : Fletcher-Powell Conjugate Gradient
'traincgp' : Polak-Ribière Conjugate Gradient
'trainoss' : One Step Secant
'traingdx' : Variable Learning Rate Gradient Descent
'traingdm' : Gradient Descent with Momentum
'traingd' : Gradient Descent
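Since timedelaynet only accepts the classic training functions listed above, leaky ReLU and Adam are only reachable through the layer-based deep learning workflow instead. The sketch below is one possible workaround, not a drop-in replacement: it emulates the tapped delay line by building the lagged samples as input features by hand, then trains a small network with leakyReluLayer and the 'adam' solver. It assumes a recent Deep Learning Toolbox release (featureInputLayer requires R2020b or later); the delays, layer sizes, and training options are illustrative.

```matlab
% Sketch: emulating a time delay network with the layer-based workflow,
% which supports leaky ReLU and the Adam solver. Not the only approach;
% all sizes and options here are illustrative.

x = sin(0.1*(1:500))';                % example scalar time series
delays = 1:2;                         % tapped delay line, like inputDelays

% Build lagged inputs by hand: row t holds [x(t-1), x(t-2)], target x(t)
T = numel(x);
X = [x(2:T-1), x(1:T-2)];             % delayed samples as feature columns
Y = x(3:T);                           % one-step-ahead targets

layers = [
    featureInputLayer(numel(delays))
    fullyConnectedLayer(10)
    leakyReluLayer(0.01)              % leaky ReLU, scale 0.01
    fullyConnectedLayer(1)
    regressionLayer];

opts = trainingOptions('adam', ...    % Adam optimizer
    'MaxEpochs', 50, ...
    'MiniBatchSize', 32, ...
    'Verbose', false);

net = trainNetwork(X, Y, layers, opts);
yPred = predict(net, X);              % one-step-ahead predictions
```

The trade-off is that you lose the timedelaynet conveniences (preparets, closed-loop simulation), but you gain the full layer catalog and solver set of the deep learning framework.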


Answers (0)

