Neural Network Output: Scaling the output range
9 views (last 30 days)
Hi,
The output layer of my (3-layer) neural network uses the sigmoid activation, which only outputs values in the range [0, 1]. However, what should I do if I want to train it on targets beyond [0, 1], say in the thousands?
For example if I want to train
input ----> output
0 0 ------> 0
0 1 ------> 1000
1000 1 ----> 1
1 1 -------> 0
My program works for AND, OR, XOR, etc., since the inputs and outputs are all binary.
There was a suggestion to use:
Activation:
-----------
y = lambda * abs(x) / (1 + exp(-x))
Derivative of activation:
-------------------------
lambda * abs(y) * y * (1 - y)
This did not converge for the training pattern above. Are there any suggestions, please?
0 comments
Accepted Answer
Greg Heath
31 Jan 2012
Hello Greg,
Thanks again for answering the question. For my case, there is no rigid bound.
1. INCORRECT. ALL 3 VARIABLES ARE BOUNDED:
0 <= X1, Y <= 1000
0 <= X2 <= 1.
2. HOWEVER, SINCE THE INPUT SCALES DIFFER BY A FACTOR OF A THOUSAND, X1 AND Y SHOULD BE TRANSFORMED VIA LOGS AND/OR POWERS, E.G.,
X1n = LOG10( 1 + X1 ) / LOG10( 1001 ) ==> 0 <= X1n <= 1
SIMILARLY FOR Y
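The log transform above, and its inverse for recovering the original scale, can be sketched as follows (Python for illustration; the function names are my own, not from the thread, and in MATLAB this is a one-line expression):

```python
import math

def to_unit(x, xmax=1000.0):
    """Compress [0, xmax] to [0, 1] via log10(1 + x) / log10(1 + xmax)."""
    return math.log10(1.0 + x) / math.log10(1.0 + xmax)

def from_unit(xn, xmax=1000.0):
    """Invert the log transform to recover the original scale."""
    return 10.0 ** (xn * math.log10(1.0 + xmax)) - 1.0

# The endpoints of [0, 1000] map to the endpoints of [0, 1],
# and the round trip recovers the original value.
print(to_unit(0.0))               # 0.0
print(to_unit(1000.0))            # 1.0
print(from_unit(to_unit(123.0)))  # ~123.0
```

The `1 + x` inside the log keeps the transform defined at `x = 0`; dividing by `log10(1 + xmax)` pins the upper endpoint at exactly 1.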
HOPE THIS HELPS.
GREG
More Answers (1)
Greg Heath
29 Jan 2012
If the target has rigid bounds, scale the data to either [0,1] or [-1,1] and use either LOGSIG or TANSIG, respectively.
Otherwise, standardize to zero-mean/unit variance and use PURELIN.
To recover the original data scale, just apply the reverse transformations.
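Both schemes and their inverses can be sketched as follows (Python for illustration; the helper names are my own). In the MATLAB Neural Network Toolbox, `mapminmax` and `mapstd` provide the equivalent functionality:

```python
import statistics

def minmax_scale(t, lo=0.0, hi=1.0):
    """Scale targets to [lo, hi]: use [0, 1] with logsig, [-1, 1] with tansig."""
    tmin, tmax = min(t), max(t)
    scaled = [lo + (hi - lo) * (v - tmin) / (tmax - tmin) for v in t]
    return scaled, (tmin, tmax)

def minmax_unscale(s, params, lo=0.0, hi=1.0):
    """Reverse min-max scaling back to the original range."""
    tmin, tmax = params
    return [tmin + (tmax - tmin) * (v - lo) / (hi - lo) for v in s]

def standardize(t):
    """Zero-mean / unit-variance targets, for a linear (purelin) output layer."""
    mu, sigma = statistics.mean(t), statistics.stdev(t)
    return [(v - mu) / sigma for v in t], (mu, sigma)

def unstandardize(z, params):
    """Reverse standardization back to the original scale."""
    mu, sigma = params
    return [mu + sigma * v for v in z]

targets = [0.0, 1000.0, 1.0, 0.0]
scaled, p = minmax_scale(targets, -1.0, 1.0)       # for a tansig output layer
recovered = minmax_unscale(scaled, p, -1.0, 1.0)   # back to the original scale
```

Whichever transform is used on the targets for training must be inverted on the network outputs at prediction time, with the same saved parameters.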
Hope this helps.
Greg