
This question is closed. Reopen it to edit or answer.

Where exactly do I place netc = closeloop(neto); netc = train(netc,X,Xoi,Aoi); in my code to avoid an error?

2 views (last 30 days)
jeff amponsah on 25 Jan 2017
Closed: Walter Roberson on 26 Jan 2017
I have some questions about training the closed loop of a narxnet using the weights from the open loop. Following an answer Greg gave on 4 November 2016, which uses netc = closeloop(neto); netc = train(netc,X,Xoi,Aoi); — where exactly do I place it in my code to avoid an error? I keep running into an error when I place it in the closed-loop section of my code. The code is given below. Any suggestions concerning my code are welcome, to enable me to make a good prediction. Thank you in advance.

% Solve an Autoregression Problem with External Input with a NARX Neural Network
% Script generated by Neural Time Series app
% Created 04-Jan-2017 05:53:56
%
% This script assumes these variables are defined:
%
%   normtrip - input time series.
%   normw    - feedback time series.
X = tonndata(normtrip,false,false);
T = tonndata(normw,false,false);
% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. Suitable in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt backpropagation.
% Create a Nonlinear Autoregressive Network with External Input
inputDelays = 1:1;
feedbackDelays = 1:1;
hiddenLayerSize = 10;
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize,'open',trainFcn);
% Choose Input and Feedback Pre/Post-Processing Functions
% Settings for feedback input are automatically applied to feedback output
% For a list of all processing functions type: help nnprocess
% Customize input parameters at: net.inputs{i}.processParam
% Customize output parameters at: net.outputs{i}.processParam
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.inputs{2}.processFcns = {'removeconstantrows','mapminmax'};
% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer
% states. Using PREPARETS allows you to keep your original time series data
% unchanged, while easily customizing it for networks with differing
% numbers of delays, with open loop or closed loop feedback modes.
[x,xi,ai,t] = preparets(net,X,{},T);
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand';  % Divide data randomly
net.divideMode = 'time';       % Divide up every sample
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';  % Mean Squared Error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
    'plotregression','plotresponse','ploterrcorr','plotinerrcorr'};
rng('default')
% Train the Network
[net,tr] = train(net,x,t,xi,ai)
% Test the Network
y = net(x,xi,ai);
e = gsubtract(t,y);
performance = perform(net,t,y)
B = cell2mat(y);
A = cell2mat(t);
x = (1:1:67);
figure
plot(x,A,x,B)
% Recalculate Training, Validation and Test Performance
trainTargets = gmultiply(t,tr.trainMask);
valTargets = gmultiply(t,tr.valMask);
testTargets = gmultiply(t,tr.testMask);
trainPerformance = perform(net,trainTargets,y)
valPerformance = perform(net,valTargets,y)
testPerformance = perform(net,testTargets,y)
% View the Network view(net);
% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, ploterrhist(e)
%figure, plotregression(t,y)
%figure, plotresponse(t,y)
%figure, ploterrcorr(e)
%figure, plotinerrcorr(x,e)
% Closed Loop Network
% Use this network to do multi-step prediction.
% The function CLOSELOOP replaces the feedback input with a direct
% connection from the output layer.
netc = closeloop(net);
netc.name = [net.name ' - Closed Loop'];  % will deal with later
view(netc);
[xc,xic,aic,tc] = preparets(netc,X,{},T);
%netc = train(netc,x,xi,ai);
yc = netc(xc,xic,aic);
closedLoopPerformance = perform(net,tc,yc);
% Multi-step Prediction
% Sometimes it is useful to simulate a network in open-loop form for as
% long as there is known output data, and then switch to closed-loop form
% to perform multistep prediction while providing only the external input.
% Here all but 5 timesteps of the input series and target series are used
% to simulate the network in open-loop form, taking advantage of the higher
% accuracy that providing the target series produces:
numTimesteps = size(x,2);
knownOutputTimesteps = 1:(numTimesteps-5);
predictOutputTimesteps = (numTimesteps-4):numTimesteps;
X1 = X(:,knownOutputTimesteps);
T1 = T(:,knownOutputTimesteps);
[x1,xio,aio] = preparets(net,X1,{},T1);
[y1,xfo,afo] = net(x1,xio,aio);
% Next the network and its final states will be converted to
% closed-loop form to make five predictions with only the five inputs
% provided.
x2 = X(1,predictOutputTimesteps);
[netc,xic,aic] = closeloop(net,xfo,afo);
[y2,xfc,afc] = netc(x2,xic,aic);
multiStepPerformance = perform(net,T(1,predictOutputTimesteps),y2)
% Alternate predictions can be made for different values of x2, or further
% predictions can be made by continuing simulation with additional external
% inputs and the last closed-loop states xfc and afc.
% Step-Ahead Prediction Network
% For some applications it helps to get the prediction a timestep early.
% The original network returns predicted y(t+1) at the same time it is
% given y(t+1). For some applications such as decision making, it would
% help to have predicted y(t+1) once y(t) is available, but before the
% actual y(t+1) occurs. The network can be made to return its output a
% timestep early by removing one delay so that its minimal tap delay is now
% 0 instead of 1. The new network returns the same outputs as the original
% network, but outputs are shifted left one timestep.
nets = removedelay(net);
nets.name = [net.name ' - Predict One Step Ahead'];
view(nets)
[xs,xis,ais,ts] = preparets(nets,X,{},T);
ys = nets(xs,xis,ais);
stepAheadPerformance = perform(nets,ts,ys)
% Deployment
% Change the (false) values to (true) to enable the following code blocks.
% See the help for each generation function for more information.
if (false)
    % Generate MATLAB function for neural network for application
    % deployment in MATLAB scripts or with MATLAB Compiler and Builder
    % tools, or simply to examine the calculations your trained neural
    % network performs.
    genFunction(net,'myNeuralNetworkFunction');
    y = myNeuralNetworkFunction(x,xi,ai);
end
if (false)
    % Generate a matrix-only MATLAB function for neural network code
    % generation with MATLAB Coder tools.
    genFunction(net,'myNeuralNetworkFunction','MatrixOnly','yes');
    x1 = cell2mat(x(1,:));
    x2 = cell2mat(x(2,:));
    xi1 = cell2mat(xi(1,:));
    xi2 = cell2mat(xi(2,:));
    y = myNeuralNetworkFunction(x1,x2,xi1,xi2);
end
if (false)
    % Generate a Simulink diagram for simulation or deployment with
    % Simulink Coder tools.
    gensim(net);
end
%B=cell2mat(yc);
%A=cell2mat(t);
%x=(1:1:67);
%figure
%plot(x,A,x,B)
2 Comments
Greg Heath on 25 Jan 2017
It would help if you:
1. Separated your code from the comments with blank lines.
2. Eliminated the copious commenting taken directly from the documentation. Brief one- or two-line comments in your own words will suffice.
3. Applied your code to the documentation example data to demonstrate your errors.
Thanks.
Greg

Answers (1)

jeff amponsah on 26 Jan 2017
Thanks, Greg.
The questions are:
1. Where do I place netc = closeloop(neto); netc = train(netc,X,Xoi,Aoi); in order to train the closed loop using the weights from the open loop? I don't know exactly where to place it in the script. This is the error:
Error using network/train (line 340)
Inputs and targets have different numbers of samples.
Error in ab (line 100)   % ab is the MATLAB script name
netc = train(netc,x,xi,ai)
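For reference, the signature is train(net,X,T,Xi,Ai): the third argument is the target series, so passing the input states xi in that position makes the sample counts disagree. A minimal sketch of the closed-loop training call, assuming the data are re-prepared with preparets for the closed-loop network (variable names follow the script above; this is a sketch of where the call might go, not a verified fix):

```matlab
% Convert the trained open-loop network and re-prepare the data for it
netc = closeloop(net);
[xc,xic,aic,tc] = preparets(netc,X,{},T);

% Train the closed loop: inputs, then targets, then the delay states
netc = train(netc,xc,tc,xic,aic);

% Simulate with the same prepared data
yc = netc(xc,xic,aic);
closedLoopPerformance = perform(netc,tc,yc)
```

The key point is that xc, tc, xic, and aic all come from the same preparets call made against netc, so their sample counts agree.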
2. I don't really understand how to go about your third suggestion, but I am working on a time series with 68 data points for both the input (normtrip) and output (normw) values. Or is that what I have described in 1?
3. The script is shown below once again:
% X and T are the input series and output series variables, respectively
X = tonndata(normtrip,false,false);
T = tonndata(normw,false,false);
% training function
trainFcn = 'trainlm';
% Creating narx network
inputDelays = 1:1;
feedbackDelays = 1:1;
hiddenLayerSize = 10;
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize,'open',trainFcn);
% Input and Feedback Pre/Post-Processing Functions
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.inputs{2}.processFcns = {'removeconstantrows','mapminmax'};
% Prepare the Data for Training and Simulation
[x,xi,ai,t] = preparets(net,X,{},T);
% Setup Division of Data for Training, Validation, Testing.
net.divideFcn = 'dividerand';  % Divide data randomly
net.divideMode = 'time';       % Divide up every sample
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% mse is chosen as performance function
net.performFcn = 'mse'; % Mean Squared Error
% Plot Functions.
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
    'plotregression','plotresponse','ploterrcorr','plotinerrcorr'};
% initializing weights
rng('default')
% Train the Network
[net,tr] = train(net,x,t,xi,ai)
% Test the Network
y = net(x,xi,ai);
e = gsubtract(t,y);
performance = perform(net,t,y)
% a plot of predicted series y and target(observed)series
B = cell2mat(y);
A = cell2mat(t);
x = (1:1:67);
figure
plot(x,A,x,B)
% Recalculate Training, Validation and Test Performance
trainTargets = gmultiply(t,tr.trainMask);
valTargets = gmultiply(t,tr.valMask);
testTargets = gmultiply(t,tr.testMask);
trainPerformance = perform(net,trainTargets,y)
valPerformance = perform(net,valTargets,y)
testPerformance = perform(net,testTargets,y)
% View the Network
view(net);
% Closed Loop Network
netc = closeloop(net);
netc.name = [net.name ' - Closed Loop'];
view(netc);
[xc,xic,aic,tc] = preparets(netc,X,{},T);
netc = train(netc,x,xi,ai);  % line 100, where the error occurs; where exactly should it be placed?
yc = netc(xc,xic,aic);
closedLoopPerformance = perform(net,tc,yc);
% Multi-step Prediction, all but 5 timesteps of the input series and output series are used to simulate the network in open loop.
numTimesteps = size(x,2);
knownOutputTimesteps = 1:(numTimesteps-5);
predictOutputTimesteps = (numTimesteps-4):numTimesteps;
X1 = X(:,knownOutputTimesteps);
T1 = T(:,knownOutputTimesteps);
[x1,xio,aio] = preparets(net,X1,{},T1);
[y1,xfo,afo] = net(x1,xio,aio);
% The network and its final states are converted to closed-loop form to make five predictions with only the five inputs provided.
x2 = X(1,predictOutputTimesteps);
[netc,xic,aic] = closeloop(net,xfo,afo);
[y2,xfc,afc] = netc(x2,xic,aic);
multiStepPerformance = perform(net,T(1,predictOutputTimesteps),y2)
% Step-Ahead Prediction Network
nets = removedelay(net);
nets.name = [net.name ' - Predict One Step Ahead'];
view(nets)
[xs,xis,ais,ts] = preparets(nets,X,{},T);
ys = nets(xs,xis,ais);
stepAheadPerformance = perform(nets,ts,ys)
% Changing the (false) values to (true) to enable the following code blocks.
if (false)
genFunction(net,'myNeuralNetworkFunction');
y = myNeuralNetworkFunction(x,xi,ai);
end
if (false)
genFunction(net,'myNeuralNetworkFunction','MatrixOnly','yes');
x1 = cell2mat(x(1,:));
x2 = cell2mat(x(2,:));
xi1 = cell2mat(xi(1,:));
xi2 = cell2mat(xi(2,:));
y = myNeuralNetworkFunction(x1,x2,xi1,xi2);
end
if (false)
gensim(net);
end
% a plot of predicted series from closeloop and target(observed) series.
B = cell2mat(yc);
A = cell2mat(t);
%x = (1:1:67);
figure
plot(x,A,x,B)
Thank you for the suggestions. I hope it is now clearer.
1 Comment
jeff amponsah on 26 Jan 2017
Edited: jeff amponsah on 26 Jan 2017
This is not an answer; I was restating the question in a clearer form and mistakenly placed it here.
