How to include four hidden layers by removing the LSTM layers?

5 views (last 30 days)
jaah navi
jaah navi on 20 June 2021
Commented: jaah navi on 28 June 2021
I have code that implements LSTM layers, as below:
inputSize = 12;
numHiddenUnits1 = 48;
numHiddenUnits2 = 48;
numHiddenUnits3 = 48;
numHiddenUnits4 = 48;
numClasses = 12;
layers = [ ...
sequenceInputLayer(inputSize)
lstmLayer(numHiddenUnits1,'OutputMode','sequence')
lstmLayer(numHiddenUnits2,'OutputMode','sequence')
lstmLayer(numHiddenUnits3,'OutputMode','sequence')
lstmLayer(numHiddenUnits4,'OutputMode','sequence')
fullyConnectedLayer(numClasses)
reluLayer
regressionLayer]
Now I want to implement four hidden layers without using LSTM. Can anyone help me modify the code accordingly?

Accepted Answer

Aparajith Raghuvir
Aparajith Raghuvir on 21 June 2021
I understand you would like to add 4 fully connected hidden layers without LSTM.
The same fullyConnectedLayer() function can be used with your choice of hyperparameters to achieve your aim. Please refer to the documentation of the above-mentioned function.
I hope this helps.
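As a sketch of what the answer suggests, each lstmLayer in the original network could be replaced with a fullyConnectedLayer followed by an activation. The layer sizes below simply reuse the values from the question (12 inputs, 48 hidden units, 12 outputs); adjust them to your needs.

```matlab
inputSize = 12;
numHiddenUnits = 48;
numClasses = 12;

% Four fully connected hidden layers in place of the four LSTM layers,
% each followed by a ReLU activation, as in the original network's head.
layers = [ ...
    sequenceInputLayer(inputSize)
    fullyConnectedLayer(numHiddenUnits)
    reluLayer
    fullyConnectedLayer(numHiddenUnits)
    reluLayer
    fullyConnectedLayer(numHiddenUnits)
    reluLayer
    fullyConnectedLayer(numHiddenUnits)
    reluLayer
    fullyConnectedLayer(numClasses)
    regressionLayer]
```

Note that without LSTM layers the network no longer carries state across time steps; with a sequenceInputLayer, the fully connected layers are applied independently at each time step.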
  4 Comments
jaah navi
jaah navi on 22 June 2021
Thanks. It works.
jaah navi
jaah navi on 28 June 2021
Could you please help me add sine, cosine, and tanh activations by replacing the reluLayer in the above code?
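This follow-up was not answered in the thread, but as a sketch: tanh has a built-in layer (tanhLayer), while sine and cosine have no dedicated layers and could be wrapped with functionLayer. Note that functionLayer is an assumption here in that it requires a newer MATLAB release (R2021b or later); on older releases a custom layer class would be needed instead.

```matlab
% Sketch: replacing ReLU activations with tanh, sine, and cosine.
% tanhLayer is built in; @sin and @cos are wrapped via functionLayer
% (R2021b+), which applies the given function element-wise.
layers = [ ...
    sequenceInputLayer(12)
    fullyConnectedLayer(48)
    tanhLayer                 % built-in tanh activation
    fullyConnectedLayer(48)
    functionLayer(@sin)       % element-wise sine activation
    fullyConnectedLayer(48)
    functionLayer(@cos)       % element-wise cosine activation
    fullyConnectedLayer(12)
    regressionLayer]
```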
