How can I trust my result when it changes with the train/test/validation split in a neural network?
Hi, I'm using the Neural Network Pattern Recognition app.
My data set has 60 subjects x 700 features with 2 class labels.
Using the dividerand function, I split the subjects into 70% training, 15% test, and 15% validation.
As a result, the accuracy always changes depending on which subjects are selected for the train, test, and validation sets.
How can I trust a result that changes according to the train/test/validation split?
As I understand it, model performance should be estimated using the divided train, test, and validation sets.
So, should I use the maximum accuracy, or should I repeat the split and use the average of the results?
I need help, please!
Thank you!
Answers (1)
Prince Kumar
21 January 2022
Hi,
This generally happens when you do not have a pre-defined train, validation, and test set for training and testing the model. As you change how the data are distributed among these sets, the result will vary accordingly. With only 60 subjects, each 15% split contains just 9 subjects, so a single random split gives a very noisy accuracy estimate. In such cases, try k-fold cross-validation: each subject is used for testing exactly once, and the mean and standard deviation of the fold accuracies give a more stable performance estimate than the maximum over random splits (which is optimistically biased).
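A minimal sketch of k-fold cross-validation with patternnet follows, assuming features `X` (700 x 60) and one-hot targets `T` (2 x 60) are already loaded; the fold count and hidden-layer size are illustrative choices, not values from the original question.

```matlab
% Hedged sketch: k-fold cross-validation for a pattern recognition net.
% Assumes X is 700x60 (features x subjects) and T is 2x60 (one-hot labels).
k = 5;                                      % 5 folds, ~12 subjects each
cv = cvpartition(size(X, 2), 'KFold', k);   % random partition of subjects
acc = zeros(k, 1);

for i = 1:k
    trainIdx = training(cv, i);             % logical index of training subjects
    testIdx  = test(cv, i);                 % logical index of held-out subjects

    net = patternnet(10);                   % hidden layer size: a guess, tune it
    net.divideParam.trainRatio = 0.85;      % within-fold split: train/validation only,
    net.divideParam.valRatio   = 0.15;      % the fold's test subjects stay held out
    net.divideParam.testRatio  = 0;

    net = train(net, X(:, trainIdx), T(:, trainIdx));

    Y = net(X(:, testIdx));                 % network outputs on held-out subjects
    [~, pred]  = max(Y, [], 1);             % predicted class per subject
    [~, truth] = max(T(:, testIdx), [], 1); % true class per subject
    acc(i) = mean(pred == truth);
end

fprintf('Mean accuracy: %.1f%% (std %.1f%%)\n', 100*mean(acc), 100*std(acc));
```

Report the mean accuracy across folds (with its standard deviation) rather than the best single split; the standard deviation tells you how much your result depends on the split choice.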
Please refer to the following documentation for more clarity.
Hope this helps!