What is the activation in an LSTM and fully connected layer?

Christos Chrysafis on 9 Jul 2018
Answered: Astarag Chattopadhyay on 25 Jul 2018
In the documentation, it is not clear what activation follows an LSTM or a fully connected layer. In an example, the network had the following structure: sequence input - LSTM layer - LSTM layer - fully connected layer - regression layer.
Someone had a similar question, and the verified answer was that activations can be added as individual layers (e.g. reluLayer), but the example above contains no reluLayer or anything similar, which means the activations must already be inside the layers (e.g. inside the LSTM layer). Could someone tell me what those activations are and whether it is possible to change them?
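
For reference, here is a minimal sketch in MATLAB of the layer array described above. The variable names and sizes (numFeatures, numHiddenUnits, numResponses) are illustrative placeholders, not values from the original example.

% Illustrative sketch of the network structure described above.
% All sizes are assumed placeholders; substitute your own dimensions.
numFeatures    = 3;    % input features per time step (assumed)
numHiddenUnits = 100;  % hidden units in each LSTM layer (assumed)
numResponses   = 1;    % number of regression targets (assumed)

layers = [ ...
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits,'OutputMode','sequence')  % activations live inside the layer
    lstmLayer(numHiddenUnits,'OutputMode','sequence')
    fullyConnectedLayer(numResponses)                  % no activation follows unless added explicitly
    regressionLayer];
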
1 Comment
Christos Chrysafis on 10 Jul 2018
I am using my GPU to train the network and have seen that cuDNN is used in that case; the activation used in the cuDNN LSTM implementation is tanh.


Accepted Answer

Astarag Chattopadhyay on 25 Jul 2018
Hi Christos,
Long Short-Term Memory layers use tanh and sigmoid as their internal activation functions: sigmoid is applied to the input, forget, and output gates, and tanh is applied to the candidate cell state and to the cell state when computing the hidden state. You can see more details on that in the following documentation page:
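
On the second part of the question (changing the activations): as far as I can tell, the activations inside lstmLayer are built in and not exposed as options in this release. If a different nonlinearity is wanted after the fully connected layer, it can be inserted as its own layer (e.g. reluLayer), as in the answer referenced in the question. A minimal sketch, reusing the placeholder sizes from the question above:

% Sketch: adding an explicit activation layer after the fully connected layer.
layers = [ ...
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits,'OutputMode','sequence')
    lstmLayer(numHiddenUnits,'OutputMode','sequence')
    fullyConnectedLayer(numResponses)
    reluLayer                         % explicit activation, added as its own layer
    regressionLayer];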

More Answers (0)


Release

R2018a
