Hello everyone. I am trying to build a neural network to predict permeability in an oil field from wireline logs, so I have 5 inputs and one target. I normalized the data to [-1, 1], but when I train the network it does not give me a good R2, just 0.58 and sometimes 0.62. What should I do to reach R2 = 0.92? Many thanks, muhammed

2 Comments

Greg Heath, 28 Oct 2014
Insufficient info:
size(input) = [5 N]?
size(output) = [1 N]?
Type of net: fitnet?
Range of hidden nodes, Hmin:dH:Hmax?
Number of random weight trials for each value of H, Ntrials?
Any other non-default parameters?
muhammed kareem, 6 Nov 2014
Thanks, dear Greg. My input is 5x251 and my target is 1x251. I use newff. I do not know how to choose the range of my hidden nodes and the weight trials.
This is my code:
net = newff(p, t, 20, {'logsig', 'purelin'});   % 20 hidden neurons
net.divideParam.trainRatio = 75/100;
net.divideParam.valRatio   = 10/100;
net.divideParam.testRatio  = 15/100;
net.trainParam.epochs   = 400;
net.trainParam.goal     = 1e-6;
net.trainParam.max_fail = 200;   % note: a max_fail this large effectively disables early stopping
net.trainParam.lr       = 0.06;
[net, tr] = train(net, p, t);
a = sim(net, test);              % 'test' holds the held-out inputs
postreg(a, tt);                  % 'tt' holds the held-out targets


Accepted Answer

Greg Heath, 7 Nov 2014

The optimal value of H is usually obtained by trial and error. For each candidate value of H, design Ntrials nets with different random initial weights.
I have posted many, many examples using a double for-loop over H and over weight initializations. Search the NEWSGROUP and ANSWERS for the latest examples using the terms
greg fitnet Ntrials
If you have the current functions FITNET (regression/curve-fitting) and PATTERNNET (classification/pattern-recognition), which automatically call FEEDFORWARDNET, use them instead of the OBSOLETE functions NEWFIT and NEWPR, which automatically call NEWFF.
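The double loop over H and random initializations can be sketched as below. This is a minimal illustration of the idea, not Greg's exact posted code; the variable names (bestR2, bestNet) and the H range and Ntrials values are illustrative assumptions, and it assumes p (5x251) and t (1x251) are already in the workspace:

```matlab
% Double loop: candidate hidden sizes x random weight initializations.
Hmin = 2; dH = 2; Hmax = 20;   % candidate hidden-layer sizes (assumed range)
Ntrials = 10;                  % random initializations per H (assumed)
bestR2 = -Inf;
for H = Hmin:dH:Hmax
    for trial = 1:Ntrials
        net = fitnet(H);                  % current replacement for newff
        net.divideParam.trainRatio = 0.70;
        net.divideParam.valRatio   = 0.15;
        net.divideParam.testRatio  = 0.15;
        rng(trial);                       % reproducible random init
        net = configure(net, p, t);       % re-initialize the weights
        [net, tr] = train(net, p, t);
        y = net(p);
        % Score R^2 on the held-out test subset only
        ttst = t(tr.testInd);
        ytst = y(tr.testInd);
        R2 = 1 - sum((ttst - ytst).^2) / sum((ttst - mean(ttst)).^2);
        if R2 > bestR2
            bestR2 = R2; bestNet = net; bestH = H;
        end
    end
end
fprintf('Best test R^2 = %.3f at H = %d\n', bestR2, bestH);
```

Ranking candidates by test-set R^2 rather than training R^2 avoids rewarding nets that merely memorized the 251 training points.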
Hope this helps.
Thank you for formally accepting my answer
Greg

