Why is the first training result from patternnet different from subsequent training runs?

2 views (last 30 days)
Imran Khan on 13 Feb 2022
Edited: Imran Khan on 19 Jan 2024
I train a patternnet four times, resetting the random number generator before each run:
clear all
close all
clc
[x,t] = iris_dataset;
rng('default');
net1 = patternnet(10);
[net1, tr1] = train(net1,x,t);
figure(1);
plotperform(tr1);
rng('default');
net2 = patternnet(10);
[net2, tr2] = train(net2,x,t);
figure(2);
plotperform(tr2);
rng('default');
net3 = patternnet(10);
[net3, tr3] = train(net3,x,t);
figure(3);
plotperform(tr3);
rng('default');
net4 = patternnet(10);
[net4, tr4] = train(net4,x,t);
figure(4);
plotperform(tr4);
But I get two different training records: the first run differs from the three subsequent runs (excuse the sizing and resolution of the images - I shrank the screenshots in Word to fit them into one image):
For the iris dataset the difference is very small in terms of cross-entropy (CE) and epoch time. But for the dataset I am actually using, the difference in the best-performing epoch is quite large (I am using only a training and a validation set):
The epoch matters for what I'm doing because I want to run the network with the parameters of the best-performing epoch; the train() function does not return the best-performing network, only the network as it stands at the last epoch. My workaround is to train the network for a specified number of epochs, note which epoch performed best, train again up to exactly that epoch, and then run my test dataset on the result. So knowing exactly which epoch performs best, and being able to reproduce it, is important (unless there is another way to have train() return the network at the best-performing epoch that I am not aware of).
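For reference, here is a minimal sketch of that retraining workaround using the iris data from my script above. tr.best_epoch, tr.vperf and the trainParam fields are standard parts of the toolbox, but the exact stopping settings are an assumption that may need adjusting for a different dataset:
rng('default');
net = patternnet(10);
[net, tr] = train(net, x, t);
bestEpoch = tr.best_epoch;            % epoch with the lowest validation error
                                      % equivalently: [~, i] = min(tr.vperf); bestEpoch = i - 1;
rng('default');                       % same seed, so initialization and data division match
net2 = patternnet(10);
net2.trainParam.epochs = bestEpoch;   % stop training at the best-performing epoch
net2.trainParam.max_fail = 1000;      % assumption: large enough that validation stopping does not end the run early
[net2, tr2] = train(net2, x, t);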
What causes this discrepancy between the first training run and the subsequent runs? Should the results of the first patternnet training run always be discarded, given that they do not appear to be the best and are not replicated in the subsequent runs? Is this a bug?

1 Answer

aditi bagora on 16 Jan 2024
Hello Imran,
I understand that you are trying to find the best-performing epoch, and the issue is that the training output differs between runs.
To answer your questions:
This is not a bug, and there is no issue with the random number generator. You can check this by running the following commands multiple times:
rng('default');
rand(1,5)
You will notice that the same random numbers are generated each time, so the model is initialized with the same weights every time you set rng('default').
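To check this for the network weights themselves (a small sketch, assuming x and t are the data from your script), you can compare the initial weight vectors of two freshly created networks using configure and getwb:
rng('default');
netA = configure(patternnet(10), x, t);   % configure also initializes the weights
rng('default');
netB = configure(patternnet(10), x, t);
isequal(getwb(netA), getwb(netB))         % expected: true, i.e. identical initial weights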
However, because the data division is random and the weights and biases are updated adaptively during training, the model might not converge at the same epoch when it is retrained.
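You can check whether the division actually differed between your runs with a small sketch like the following (assuming tr1 and tr2 are the training records from two of your runs):
sameSplit = isequal(tr1.trainInd, tr2.trainInd) && ...
            isequal(tr1.valInd, tr2.valInd) && ...
            isequal(tr1.testInd, tr2.testInd)   % true if both runs used the same train/val/test division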
Hope this helps!
Regards,
Aditi
1 Comment
Imran Khan on 19 Jan 2024
Edited: 19 Jan 2024
Hi Aditi,
Thanks for your reply. I ended up filing a bug report about this and received feedback from MathWorks technical support. While not technically a bug, there is unexpected/undocumented behaviour around patternnet (and, I suspect, around other algorithms that require randomized initialization) that users should know about if they are repeating multiple patternnet runs as I was. See the response from MathWorks technical support below:
Neural networks are deterministic in the sense that, if the initial random conditions are the same, they will train to the same point. Regardless of the randomness associated with data division and the weight/bias updates, the model should converge in exactly the same way when the random number generator is seeded identically - that is how it is supposed to work, and it does for all runs except the first. As a workaround in my script, I ended up simply discarding the first patternnet run.
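For anyone else who runs into this, a minimal sketch of that workaround with the iris data from my question looks like this:
[x, t] = iris_dataset;
rng('default');
warmup = patternnet(10);
warmup = train(warmup, x, t);    % throwaway first run; its results are discarded
rng('default');
net = patternnet(10);
[net, tr] = train(net, x, t);    % runs from here on reproduce each other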


Category: Build Deep Neural Networks
Release: R2020b
