How can I see the total number of parameters and the effective number of parameters of a neural network trained with trainbr?

24 views (last 30 days)
Hi everyone, I'm training TDNN (time delay) neural networks for reactive energy prediction. I have trained several networks and found that the ones giving the best results are those trained with trainbr.
Now, however, I am faced with results I cannot explain. I trained a network with 20 neurons and 48 input delays; it ran for 1000 iterations and gave results that, while not excellent, were at least satisfactory on both of the data sets I use.
I then trained a second network with 72 delays and 36 neurons in total (I am always referring to the neurons in the hidden layers). In this case, however, I stopped the training manually after about 300 iterations. Why are the results for output_2019 so poor?
Using trainbr, I should be able to find the number of effective parameters actually used by the network compared with the total number. How do I see the effective and total parameter counts for my network?
Why are the results of the second network (trained for 300 iterations) poorer than those of the network with fewer neurons? I should point out that, graphically, the performance of the second network actually looks better. Would anyone kindly be able to answer?
I am attaching the type of script I use.
% input - input time series.
% target - target time series.
load('input_new');
load('target_new');
load('meanNew_ER');
load('mean_ER_2019');
load('target_2019');
load('input_2019');
X = tonndata(input_new,false,false);
T = tonndata(target_new,false,false);
input_2019 = tonndata(input_2019,false,false);
% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. Suitable in low memory situations.
trainFcn = 'trainbr'; % Bayesian Regularization backpropagation.
% Create a Time Delay Network
inputDelays = 1:48;
hiddenlayer1.size = 10;
hiddenlayer1.transferfcn = 'logsig';
hiddenlayer2.size = 10;
hiddenlayer2.transferfcn = 'tansig';
hiddenLayerSize = [hiddenlayer1.size,hiddenlayer2.size];
net = timedelaynet(inputDelays,hiddenLayerSize,trainFcn);
% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows','mapminmax'};
net.output.processFcns = {'removeconstantrows','mapminmax'};
% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer
% states. Using PREPARETS allows you to keep your original time series data
% unchanged, while easily customizing it for networks with differing
% numbers of delays, with open loop or closed loop feedback modes.
[x,xi,ai,t] = preparets(net,X,T);
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivision
net.divideFcn = 'dividerand'; % Divide data randomly (note: trainbr disables validation stops by default)
net.divideMode = 'time'; % Divide up every sample
net.divideParam.trainRatio = 75/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 10/100;
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse'; % Mean Square Error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate', 'ploterrhist', ...
'plotregression', 'plotresponse', 'ploterrcorr', 'plotinerrcorr'};
% Train the Network
[net,tr] = train(net,x,t,xi,ai);
% Test the Network
y = net(x,xi,ai);
e = gsubtract(t,y);
performance = perform(net,t,y);
% Recalculate Training, Validation and Test Performance
trainTargets = gmultiply(t,tr.trainMask);
valTargets = gmultiply(t,tr.valMask);
testTargets = gmultiply(t,tr.testMask);
trainPerformance = perform(net,trainTargets,y);
valPerformance = perform(net,valTargets,y);
testPerformance = perform(net,testTargets,y);
% View the Network
view(net)
% Apply the trained network to the 2019 series.
% Note: calling net(input_2019) directly uses zero initial delay states,
% so the first inputDelays outputs are not reliable.
output_2019 = net(input_2019);
output_2019 = transpose(cell2mat(output_2019));
% Compute the mean absolute error as a percentage of the mean value
error = transpose(cell2mat(e));
mean_abs_e = mean(abs(error));
percent_e = (mean_abs_e/meanNew_ER)*100
error_2019 = mean(abs(target_2019 - output_2019))
percent_e_2019 = (error_2019/mean_ER_2019)*100
% Plots
output = transpose(cell2mat(y));
target = transpose(cell2mat(t));
% Use separate variable names for the plot axes so the network input x
% (still needed by plotinerrcorr below) is not overwritten.
t_axis = linspace(0,87528,87528);
t_axis_2019 = linspace(0,17520,17520);
% Diagnostic plots.
figure, plotperform(tr)
figure, plottrainstate(tr)
figure, ploterrhist(e)
figure, plotregression(t,y)
figure, plotresponse(t,y)
figure, ploterrcorr(e)
figure, plotinerrcorr(x,e)
figure
plot(t_axis,target,t_axis,output)
xlabel('quarter hour')
ylabel('Reactive Power')
title('Target / output plot')
figure
plot(t_axis_2019,target_2019,t_axis_2019,output_2019)
xlabel('quarter hour')
ylabel('Reactive Power')
title('Reactive power time series 2019')

Answers (1)

Prateek Rai on 1 June 2022
Hi,
To my understanding, you are training TDNN neural networks using 'trainbr' and want to know more about the trained network.
'train' (with trainFcn = 'trainbr') also returns a training record tr as a second output argument. This is a structure whose fields depend on the network training function (net.trainFcn); it includes the training, data-division, and performance functions and parameters, which helps in understanding the network better.
You can refer to the "Output Arguments" section of the 'trainbr' MathWorks documentation page to learn more.
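As a sketch of how you might read those quantities from the script above (where train returns the record in tr): the total number of adjustable parameters is a property of the network object, and trainbr's training record carries a gamk field with the effective number of parameters at each epoch.

```matlab
% After [net,tr] = train(net,x,t,xi,ai) with trainFcn = 'trainbr':

% Total number of adjustable parameters (all weights and biases)
totalParams = net.numWeightElements;   % equivalently: numel(getwb(net))

% trainbr records the effective number of parameters per epoch in tr.gamk
effectiveParams = tr.gamk(end);        % value at the final epoch

fprintf('Total parameters: %d\n', totalParams);
fprintf('Effective parameters (gamma): %.1f\n', effectiveParams);

% Plotting tr.gamk over the epochs shows whether gamma has converged;
% if it is still changing, training was stopped too early.
figure, plot(tr.epoch, tr.gamk)
xlabel('epoch'), ylabel('effective number of parameters')
```

If the effective number is far below the total, the network is effectively using only a fraction of its weights. This may also bear on your second question: Bayesian regularization is generally meant to run until the effective number of parameters has converged, so stopping the larger (72-delay, 36-neuron) network manually after ~300 iterations may simply have left it under-trained compared with the smaller network that ran for 1000.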
