How to avoid getting negative values when training a neural network?

81 views (last 30 days)
Mostafa Nakhaei on 18 Jan 2020
Answered: Mostafa Nakhaei on 30 Jan 2020
Is there any way to constrain the outputs when training a feed-forward neural network in MATLAB?
I am trying to train a supervised feed-forward neural network with 100,000 observations. I have 5 continuous variables and 3 continuous responses (labels). All my values are positive (labels and variables). However, when I train the network, it sometimes predicts negative results no matter what architecture I use. Negative results have no physical meaning and should not appear. Is there any way to constrain the network? I also used a ReLU activation function for the last layer, but then the network cannot generalize well.
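One way to hard-constrain the predictions to be positive, sketched below with placeholder data (the 10-neuron architecture and the random inputs are only illustrative, not from the question), is to train on the log of the targets and exponentiate the network output:

```matlab
% Sketch: guarantee positive predictions by fitting in log space.
% X and T here are placeholder data, not the poster's dataset.
X = rand(5, 1000);            % 5 continuous predictors, 1000 samples
T = exp(randn(3, 1000));      % 3 strictly positive responses

net = feedforwardnet(10);     % one hidden layer with 10 neurons (illustrative)
net = train(net, X, log(T));  % fit log(targets) with the default linear output
Ypred = exp(net(X));          % exponentiating makes every prediction > 0
```

The exponential undoes the log transform, so positivity holds by construction; the trade-off is that the network now minimizes error in log space, which weights relative rather than absolute errors.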


Mostafa Nakhaei on 30 Jan 2020
I found the answer to my problem. The main reason I was getting negative results after training and testing on positive numbers was that the distribution of the new dataset differed from that of the training and test samples: it had more noise. In my case, the solution was not to change the activation function of the last layer (that led to physically meaningless results) but to add some synthetic random noise to my dataset. This made the model robust to the noise.
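The noise-augmentation idea above might be sketched as follows; the noise level `sigma` and the multiplicative-jitter form are assumptions to be tuned against the noise actually seen in the new data:

```matlab
% Sketch of synthetic-noise augmentation (sigma is a hypothetical value).
% X (5-by-N inputs) and T (3-by-N targets) are assumed to already exist.
sigma  = 0.05;                               % assumed relative noise level
Xnoisy = X .* (1 + sigma*randn(size(X)));    % jitter each input sample
Xaug   = [X, Xnoisy];                        % augmented input set
Taug   = [T, T];                             % noisy copies keep clean targets

net = feedforwardnet(10);                    % illustrative architecture
net = train(net, Xaug, Taug);                % train on the augmented data
```

Training on both the clean and jittered copies pushes the network toward predictions that are stable under the kind of perturbation the new data exhibits.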

More Answers (1)

Greg Heath on 18 Jan 2020
Use a sigmoid for the output layer.
Hope this helps.
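In MATLAB's shallow-network API, swapping the output-layer transfer function is a one-line property change; the sketch below shows the mechanics (the 10-neuron network is illustrative):

```matlab
% Sketch: change the output-layer transfer function of a feedforwardnet.
% 'logsig' maps outputs into (0,1); 'poslin' (ReLU) clips them at 0.
net = feedforwardnet(10);                        % illustrative architecture
net.layers{net.numLayers}.transferFcn = 'logsig';
```

Note that `logsig` bounds outputs to (0,1), so for unbounded positive targets the targets would also need to be rescaled into that range before training.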
1 Comment
Mostafa Nakhaei on 18 Jan 2020
Thanks, Greg, for the response.
This is a regression problem, and I also guess a sigmoid would give negative results as well.





