Coefficient of determination in a nonlinear system
4 views (last 30 days)
I'm having a little bit of a problem figuring out how to find R^2. I have used lsqnonlin to find three variable coefficients in a nonlinear equation. I need to know how to find the R^2 using these, and I cannot find any way to do it. It's most likely something simple that I have missed, but either way I need a little push in the right direction. Thank you.
0 comments
Answers (1)
John D'Errico
25 Dec 2023
Edited: John D'Errico, 25 Dec 2023
R^2 is not always a meaningful thing to compute for nonlinear regression models. On some problems (those with no constant coefficient to estimate) you might even find it results in a negative number. Worse, I have often explained (ok, some might have called what I said a rant) the serious problems R^2 has in general. You should never be making decisions about a fit using only one number. Plot the curve, plot the data. Are you happy with the result? If so, then be happy, and don't worry about a number.
However, it is not a difficult thing to compute in theory. Essentially, R^2 tells how well the fit did when compared to a simple constant model. How much of the variation in the data did your model explain? Really, when you understand the concept of R^2, computing it is literally trivial. (There are also various adjusted forms of R^2, but I'll stop here.) I'll give an example below, for a simple nonlinear model.
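In symbols (my notation, not from the answer above), with y_i the data, ŷ_i the model predictions, and ȳ the mean of the data, the comparison against a constant model reads:

```latex
R^2 \;=\; 1 - \frac{SS_{\mathrm{res}}}{SS_{\mathrm{tot}}}
      \;=\; 1 - \frac{\sum_i \left(y_i - \hat{y}_i\right)^2}{\sum_i \left(y_i - \bar{y}\right)^2}
```

If the model does no better than the constant ȳ, the ratio is 1 and R^2 is 0; a nonlinear model can do worse than that, which is why a negative R^2 is possible.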
First, make up some data. I'll give it a lot of noise.
x = sort(rand(100,1)); % the sort is merely to make the plots nice
y = 1.5 + sin((x - 1.2)*1.25) + randn(size(x))/20;
plot(x,y,'o')
Now, of course you know the true underlying model, since I made up the data in front of you. But pretend you have no clue. In that case, an exponential model might seem vaguely appropriate, just looking at the curve. Honestly, that data is so crappy, a quadratic polynomial should be entirely adequate given the noise.
Here is a simple exponential model, as a function of two parameters.
Fexp = @(x,parms) parms(1)*exp(x*parms(2));
We are going to use lsqnonlin, so I'll write the residual function in a way that will accept whatever model one may choose.
fitfun = @(mdl,parms) mdl(x,parms) - y;
Now use lsqnonlin to estimate the parameters.
parms0 = [1 1]; % not a very intelligent choice, but it will be adequate here
[parmsexp,RESNORM,RESIDUAL,EXITFLAG] = lsqnonlin(@(parms) fitfun(Fexp,parms),parms0)
An exitflag of 3 is acceptable. Large-residual nonlinear least squares is always problematic. Plot the data and the model here...
plot(x,y,'ko',x,Fexp(x,parmsexp),'r-')
The fit does not seem unreasonable, though I would argue there is some lack of fit. A quadratic polynomial is probably a better choice; with 3 parameters to estimate instead of 2, it should do slightly better.
Time to compute R^2 now for this model.
SStot = sum((y - mean(y)).^2)
SSres = sum((y - Fexp(x,parmsexp)).^2)
R2exp = 1 - SSres/SStot
An R^2 of 0.96 is not bad, but not great either. In the end, I'll still argue that a better measure is just what my eyes tell me. This model suffers a little from lack of fit, and I can see that by eye. The bottom end of the curve is not quite right.
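For comparison, the quadratic polynomial suggested above can be fit the same way. This is only a sketch, not part of the original answer; it uses polyfit/polyval and reuses the SStot already computed for this data, since the total sum of squares does not depend on the model.

```matlab
% Fit the 3-parameter quadratic and compute its R^2 by the same recipe.
pquad  = polyfit(x,y,2);                     % quadratic: 3 coefficients
yhatq  = polyval(pquad,x);                   % predictions at the data
SSresq = sum((y - yhatq).^2);                % residual sum of squares
R2quad = 1 - SSresq/SStot                    % SStot as computed above

% Overlay both fits on the data to judge by eye, as argued above.
plot(x,y,'ko', x,Fexp(x,parmsexp),'r-', x,yhatq,'b-')
legend('data','exponential','quadratic','Location','best')
```

If the quadratic's R^2 comes out higher, remember it also has one more parameter; an adjusted R^2, or simply the plot, is a fairer comparison.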
0 comments