How to use a leaky ReLU/softmax transfer function in a hidden layer of a feedforward neural network?

Ihsan Ullah on 3 Apr 2019
Answered: Abhishek Tiwari on 10 Jul 2022
Hi,
I am using a feedforward neural network with an input layer, one hidden layer, and an output layer. I want to change the transfer function in the hidden layer to a leaky ReLU, but the usual command (shown below for the poslin transfer function) does not work:
net.layers{1}.transferFcn = 'poslin'; % this works for poslin
Please suggest the command for changing the transfer function in layer 1 to a leaky ReLU, and also the command for changing the output layer transfer function to softmax in a feedforward neural network.
Thank you,
Ihsan

Answers (1)

Abhishek Tiwari on 10 Jul 2022
Hi,
The following is a list of all the built-in transfer functions for shallow networks:
% compet - Competitive transfer function.
% elliotsig - Elliot sigmoid transfer function.
% hardlim - Positive hard limit transfer function.
% hardlims - Symmetric hard limit transfer function.
% logsig - Logarithmic sigmoid transfer function.
% netinv - Inverse transfer function.
% poslin - Positive linear transfer function.
% purelin - Linear transfer function.
% radbas - Radial basis transfer function.
% radbasn - Radial basis normalized transfer function.
% satlin - Positive saturating linear transfer function.
% satlins - Symmetric saturating linear transfer function.
% softmax - Soft max transfer function.
% tansig - Symmetric sigmoid transfer function.
% tribas - Triangular basis transfer function.
net.layers{1}.transferFcn = 'poslin';
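
Since softmax is in this list, the output layer can be switched with the same assignment syntax as in your question. A minimal sketch, assuming a feedforwardnet with a single hidden layer (so the output layer is layer 2):
net = feedforwardnet(10);              % one hidden layer with 10 neurons
net.layers{1}.transferFcn = 'poslin';  % poslin is the closest built-in to ReLU
net.layers{2}.transferFcn = 'softmax'; % softmax on the output layer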
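
There is no leaky ReLU in that list, so it cannot be assigned to net.layers{1}.transferFcn directly. One alternative, sketched below on the assumption that the layer-based Deep Learning Toolbox workflow is available (featureInputLayer requires R2020b or later), is to build the network from layers, since leakyReluLayer and softmaxLayer are built in. Here numFeatures, numClasses, XTrain, and YTrain are placeholders for your own data:
% Minimal sketch of the layer-based alternative (assumes Deep Learning Toolbox)
layers = [
    featureInputLayer(numFeatures)   % numeric feature input
    fullyConnectedLayer(10)          % hidden layer with 10 neurons
    leakyReluLayer(0.01)             % leaky ReLU with scale 0.01
    fullyConnectedLayer(numClasses)
    softmaxLayer                     % softmax output
    classificationLayer];
net = trainNetwork(XTrain, YTrain, layers, trainingOptions('adam'));
Another option is to define a custom transfer function modeled on poslin for use with the shallow-network API, but the layer-based workflow above is usually simpler.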

Release: R2018b