
How to provide training data to the neural network?

Vineet on 23 Apr 2013
I created a feed forward neural network using the newff function.
The code is below:
net=newff(P,T, [5 5], {'tansig', 'purelin'},'trainlm', 'learngdm');
net.trainParam.show = 10; %showing results after every 10 iterations
net.trainParam.lr = 0.01; %learning rate
net.trainParam.epochs = 50; %no. of iterations
net.trainParam.goal = 0.0001; % percentage error goal
net1 = train(net, P, T);%training the network
I need to know how to provide the testing data after the training is done.
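In other words, I am after something along the lines of the sketch below, where Ptest and Ttest are hypothetical held-out inputs and targets with the same row sizes as P and T. Is sim the right call here?
Ytest = sim(net1, Ptest); % or equivalently: Ytest = net1(Ptest)
Etest = Ttest - Ytest; % errors on the held-out data
testMSE = mse(Etest) % mean squared error on the held-out data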
1 Comment
Greg Heath on 23 Apr 2013
If you can, change the title to use "additional testing" instead of "training". The adjective "additional" recognizes that the default data division already creates a nondesign testing subset.
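A minimal sketch of where that default division lives, assuming a toolbox version in which newff sets dividerand as the divide function (consistent with the 0.15 test fraction used in the answer below):
net = newff(P, T, 10); % dividerand assumed to be the default divideFcn
net.divideFcn % 'dividerand': random division of the sample columns
net.divideParam % trainRatio 0.70, valRatio 0.15, testRatio 0.15 by default
% i.e., about 15% of the columns of P and T are already held out as a nondesign test subset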


Accepted Answer

Greg Heath on 23 Apr 2013
Always start with default values. If they don't work, change one at a time.
You have two hidden layers. One is sufficient.
close all, clear all, clc
[ p, t ] = simplefit_dataset;
whos
% Name Size Bytes Class
% p 1x94 752 double
% t 1x94 752 double
[ I N ] = size( p) % [ 1 94 ]
[ O N ] = size( t) % [ 1 94 ]
Ntst = round(0.15*N) % 14 default data division
Nval = Ntst % 14
Ntrn = N-2*Ntst % 66
Ntrneq = Ntrn*O % 66 No. of training equations
% Nw = (I+1)*H+(H+1)*O = O + (I+O+1)*H % No. of unknown weights
% Ntrneq > Nw when H <= Hub, since Ntrneq > O + (I+O+1)*H
% gives H < (Ntrneq-O)/(I+O+1)
Hub = -1 + ceil( ( Ntrneq-O)/(I+O+1) ) % 21
H = 10 % Choose default value
Nw = O + (I+O+1)*H % 31 unknown weights
% Initialize RNG so default random data division and random initial weights can be duplicated
rng(0)
net = newff( p, t, H );
view(net)
[ net tr y0 e0 ] = train( net, p, t );
y = net(p); % same as y0
e = t-y; % same as e0
MSE = mse(e) % 4.988e-8
tr = tr % training details
% New data
ynew = net(pnew);
Hope this helps.
Thank you for formally accepting my answer
Greg
P.S. See tr for the separate training, validation and test results!
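A minimal sketch of reading those subsets out of tr and of scoring genuinely new data, with pnew and tnew as hypothetical additional inputs and targets:
trnInd = tr.trainInd; % training-subset indices from the data division
valInd = tr.valInd; % validation-subset indices
tstInd = tr.testInd; % test-subset indices
MSEtrn = mse( t(:,trnInd) - y(:,trnInd) ) % performance on the design (training) subset
MSEval = mse( t(:,valInd) - y(:,valInd) ) % performance on the validation subset
MSEtst = mse( t(:,tstInd) - y(:,tstInd) ) % performance on the nondesign test subset
ynew = net(pnew); % additional testing: pnew = hypothetical new inputs
MSEnew = mse( tnew - ynew ) % tnew = hypothetical new targets for those inputs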

More Answers (0)
