Why does MATLAB give me a different output value every time I train a neural network?

3 views (last 30 days)
I was training a multilayer neural network. The input is 3 variables by 150 samples (3x150) and the target is 1x150.
I did not specify the weights and biases. Is that why I get a different output value every time I train the network?
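For example, something along these lines (random placeholder data, hypothetical hidden-layer size of 10) gives a different output on every run:

 x = rand(3, 150);               % 3 inputs  x 150 samples (placeholder data)
 t = rand(1, 150);               % 1 target  x 150 samples (placeholder data)
 netA = train(fitnet(10), x, t);
 netB = train(fitnet(10), x, t);
 max(abs(netA(x) - netB(x)))     % generally nonzero: the two trained nets differ
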

Accepted Answer

Greg Heath on 2 Jul 2015
The default data division and weight initialization are both random.
To reproduce a design you have to know the initial state of the RNG before the net is configured with initial weights and the data is divided into training, validation and test subsets.
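For example, a minimal sketch (placeholder data, arbitrary hidden-layer size of 10) of fixing the RNG state so that a design can be repeated exactly:

 x = rand(3, 150);                 % 3 inputs x 150 samples (placeholder data)
 t = rand(1, 150);                 % 1 target x 150 samples (placeholder data)
 rng(0, 'twister')                 % known RNG state before the first design
 net1 = fitnet(10);
 [net1, tr1] = train(net1, x, t);  % weight init + data division happen in TRAIN
 rng(0, 'twister')                 % restore the same RNG state ...
 net2 = fitnet(10);
 [net2, tr2] = train(net2, x, t);  % ... so the second design repeats the first
 isequal(net1(x), net2(x))         % should return true (logical 1)
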
When designing multiple nets in a double for loop (creation in the outer loop and training in the inner loop), you only have to initialize the RNG once: before the first loop. The RNG changes its state every time it is called. Therefore, for reproducibility, record the RNG state at the beginning of the inner loop.
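A sketch of that double-loop pattern (the hidden-layer candidates, trial count and use of CONFIGURE to redraw initial weights are illustrative choices, not a prescription):

 x = rand(3, 150);  t = rand(1, 150);   % placeholder data, sized as above
 rng(0)                                 % initialize the RNG once, before the loops
 Hvec    = [5 10 15];                   % candidate hidden-layer sizes (example)
 Ntrials = 10;                          % random-initialization trials per size
 rngState = cell(numel(Hvec), Ntrials);
 for i = 1:numel(Hvec)                  % outer loop: create the net
     net = fitnet(Hvec(i));
     for j = 1:Ntrials                  % inner loop: train repeatedly
         rngState{i, j} = rng;          % record the state to reproduce this design
         net = configure(net, x, t);    % fresh random initial weights each trial
         [net, tr] = train(net, x, t);  % random data division happens here as well
     end
 end
 % To reproduce design (i, j) later: rng(rngState{i, j}), then recreate,
 % configure and train again.
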
Exactly when the RNG is called differs between the different generations of design functions. For special cases of the obsolete NEWFF family (e.g., NEWFIT, NEWPR and NEWFF), weights are initialized when the nets are created. For special cases of the current FEEDFORWARDNET family (e.g., FITNET, PATTERNNET and FEEDFORWARDNET), weights can be initialized explicitly by the CONFIGURE function; otherwise, they will be automatically initialized by the function TRAIN.
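A sketch of those two initialization paths for the current family (placeholder data and an arbitrary hidden-layer size again):

 x = rand(3, 150);  t = rand(1, 150);   % placeholder data
 rng(0)
 % Explicit: CONFIGURE draws the initial weights now, so the RNG is called here.
 net = fitnet(10);
 net = configure(net, x, t);
 wb0 = getwb(net);                      % initial weights and biases already exist
 % Implicit: skip CONFIGURE and TRAIN configures the net, calling the RNG itself,
 % just before training starts.
 net = fitnet(10);
 [net, tr] = train(net, x, t);
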
When I find out exactly where the data is divided, I will post in both the NEWSGROUP and ANSWERS.
Hope this helps.
Thank you for formally accepting my answer
Greg

More Answers (1)

Walter Roberson on 1 Jul 2015
The weights are initialized randomly unless you specifically initialize them.
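For instance, a minimal sketch (placeholder data and purely illustrative weight values) of specifying them yourself:

 x = rand(3, 150);  t = rand(1, 150);   % placeholder data: 3 inputs, 1 target
 net = fitnet(10);
 net = configure(net, x, t);            % size the weight matrices for this data
 net.IW{1,1} = 0.1*ones(10, 3);         % input-to-hidden weights (chosen values)
 net.LW{2,1} = 0.1*ones(1, 10);         % hidden-to-output weights
 net.b{1}    = zeros(10, 1);            % hidden-layer biases
 net.b{2}    = 0;                       % output bias
 [net, tr] = train(net, x, t);          % training now starts from a fixed point

Note that the data division is still random, so for fully repeatable results you would also seed the RNG as described in the accepted answer.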
