Validation and test data show NaN, regularization does not work, problems with divideblock.
I am trying to classify large datasets (~80-100k samples). The problem is that the data are time series in which neighboring datapoints can be highly correlated, which seems to make block validation (divideblock) necessary.
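For completeness, this is how I understand block division is configured (the ratios below are just example values, not the ones I necessarily need):

```matlab
net.divideFcn = 'divideblock';        % contiguous blocks instead of random indices
net.divideParam.trainRatio = 0.70;    % first 70% of samples for training
net.divideParam.valRatio   = 0.15;    % next 15% for validation
net.divideParam.testRatio  = 0.15;    % last 15% for testing
```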
I am running multiple trials with an increasing number of hidden neurons. I have googled "Greg Ntrials", but I did not quite understand how the recommendations transfer to a binary classification problem with 6-36 inputs and 80k samples.
Ntrials = 0 : 5 : 20;
for n = Ntrials
    net = patternnet(length(X{1}) + n);     % hidden layer grows with each trial
    net.trainParam.showWindow = 0;          % suppress the nntraintool window
    net.trainFcn = 'trainlm';
    net.performParam.regularization = 0.1;
    net = train(net, X, T, 'useParallel', 'yes');
end
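For comparison, here is a minimal variant that, as far as I understand, keeps crossentropy: trainscg (patternnet's default training function) supports it, whereas trainlm appears to force mse. The hidden-layer size is a placeholder:

```matlab
net = patternnet(10);                     % placeholder hidden layer size
net.trainFcn = 'trainscg';                % scaled conjugate gradient, works with crossentropy
net.performFcn = 'crossentropy';
net.performParam.regularization = 0.1;
net = train(net, X, T);
```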
First of all, I always get the following message, even if I manually specify 'crossentropy' as net.performFcn:
Warning: Performance function replaced with squared error performance.
Second, I sometimes get the following error:
Subscript indices must either be real positive integers or logicals
Error in divideblock>divide_indices (line 108)
Third, the nntraintool window always shows up, and net.trainParam.showWindow is set to 1 after training.
Fourth, and most importantly, plotconfusion always shows NaN for both the validation and test data. I have tried various settings, including the very default ones, and none of them seem to use validation: the confusion matrix shows NaN for the validation and test sets whatever I do, and the validation checks counter in nntraintool shows 0 - 0.
Finally, when I set regularization to 0.1, the training function switches to trainbr. If I turn it off, it stays at trainlm, but still changes performFcn to mse.
I tried to thin out the training set with X = X(1:10:end) and T = T(1:10:end), but the behavior stays the same.
Any help would be much appreciated. Thank you so much.