# Regularization for Non-linear fitting with fitnlm.

17 views (last 30 days)
Carlos Amador on 12 Nov 2019
Edited: John D'Errico on 14 Nov 2019
I'm trying to fit a nonlinear model with many possible coefficients using fitnlm, and the amount of data I have is limited. With polynomial empirical models I tend to use stepwise regression to find the coefficients that are most important (a reduced number of coefficients that explains most of the variance). However, fitnlm (and every other MATLAB function for nonlinear fitting) fits all the coefficients, which leads to overfitting. MATLAB provides p-values for the different coefficients, but since it is fitting all of them at once, we can't use those p-values to drop coefficients. Is there a way to combine regularization techniques for parameter selection with the fitnlm function, which uses least-squares fitting?

#### 2 Comments

Adam Danz on 12 Nov 2019
Could you do the fitting twice, such that on the first pass you obtain the p-values and the parameter estimates, and on the second pass you keep only the significant parameters and supply fixed values for the ones you're leaving out?
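That two-pass idea could be sketched roughly as follows. The model function and starting values here are placeholders for illustration, not taken from the original question:

```matlab
% x, y: your data (column vectors), assumed already in the workspace.
% modelfun and beta0 below are hypothetical placeholders.
modelfun = @(b,x) b(1)*exp(-b(2)*x) + b(3);

% First pass: fit the full nonlinear model and inspect the p-values.
mdl1 = fitnlm(x, y, modelfun, [1 1 0]);
disp(mdl1.Coefficients)   % look at the pValue column

% Second pass: drop the term whose coefficient was insignificant
% (say b(3) here) and refit the reduced model.
reduced = @(b,x) b(1)*exp(-b(2)*x);
mdl2 = fitnlm(x, y, reduced, [1 1]);
```

As the follow-up comment notes, though, the p-value ranking from the full model need not survive the removal of terms.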
Carlos Amador on 14 Nov 2019
Hi Adam, thanks for the suggestion. I discussed this option with MATLAB support, but it turns out it cannot be done this way: if you start with all coefficients, the ranking of the p-values is not necessarily the same as when you start removing terms.


### Answers (3)

John D'Errico on 14 Nov 2019

But how can you do a simple regularization on the coefficients of a polynomial model? It's probably a really bad idea. But... you can do it. However, a far better choice is not to use a high-order polynomial at all. Use a regression spline.
First, let's talk about what it means in terms of the linear algebra. A classical regularization (the name ridge regression is common) minimizes the sum of squared residuals for the model, plus a penalty on the coefficients:
sum((P(x,C) - y).^2) + lambda*(sum(C.^2))
(I should make the effort to remember how to use LaTeX for things like that. Too lazy. Sigh. You get the idea though.)
where C are the polynomial coefficients and P(x,C) is a polynomial model.
We can write that in the form of linear algebra using a partial Vandermonde matrix, taking only the orders up to the polynomial order we want. Assume we want to build a polynomial model of order N, using a regularization parameter of lambda.
First formulate the problem in the form A*C = y, with C as the set of regression coefficients.
A = x(:).^(N:-1:0);
The matrix is such that if we now solve the problem as
C = A\y(:);
this will minimize norm(A*C - y), so it is a regression fit. (Please don't tell me you want to solve the problem using the normal equations, thus C = inv(A'*A)*A'*y. That is a really bad idea too. Use backslash, as I show.)
Regardless, that is the standard polynomial model, with no regularization. The trick now becomes to solve for C as
C = [A;sqrt(lambda)*eye(N+1)]\[y(:);zeros(N+1,1)];
(Remember there are N+1 coefficients in a degree N polynomial, so the identity block must be N+1 square; augmenting with sqrt(lambda) makes the implied penalty exactly lambda*sum(C.^2).) This estimates the regularized model. Again, some people will tell you to use this form:
C = inv(A'*A + lambda*eye(N+1))*A'*y(:)
Again, a really bad idea. USE BACKSLASH, as I showed above. Here is some code:
```matlab
function C = regpolyfit(x,y,N,lambda)
% fits a regularized polynomial model of degree N,
% with regularization coefficient lambda
% partial Vandermonde matrix, of order N.
% note that the coefficients are in the standard polyfit form,
% with the highest order term first.
A = x(:).^(N:-1:0);
% solve, using backslash, applied to the augmented matrix.
% remember there are N+1 coefficients in a degree N polynomial.
C = [A;sqrt(lambda)*eye(N+1)]\[y(:);zeros(N+1,1)];
% return the coefficients as a row vector, consistent with polyfit
C = C.';
end
```
For example,

```matlab
x = linspace(-2*pi,2*pi);
y = cos(x);
P10 = polyfit(x,y,10)
```

```
P10 =
  -1.1146e-07 -4.2283e-20 1.8711e-05 3.2085e-18 -0.0012777 -7.9263e-17 0.040727 7.0408e-16 -0.49708 -1.6642e-15 0.99853
```

```matlab
RP10 = regpolyfit(x,y,10,1)
```

```
RP10 =
  2.6172e-08 1.5634e-20 4.439e-06 -1.1833e-18 -0.0007458 2.9709e-17 0.032217 -2.7161e-16 -0.4435 6.1345e-16 0.91365
```
As you can see, for this 10th-degree polynomial model the coefficients are biased towards zero. But did that improve things? In fact, the basic 10th-degree polynomial model from polyfit actually had a very good estimate of the constant term, which should be 1 in a Taylor series approximation.
The regularized model, however, turned 0.99853 into 0.91365. A simple truncated Taylor series would have had:
```matlab
syms X
vpa(taylor(cos(X),X,'Order',10),10)
```

```
ans =
0.0000248015873*X^8 - 0.001388888889*X^6 + 0.04166666667*X^4 - 0.5*X^2 + 1.0
```
So polyfit actually did a decent job on the lowest order coefficients.
```matlab
plot(x,y,'bo',x,polyval(P10,x),'g-',x,polyval(RP10,x),'r-')
legend('data','polyfit','regpolyfit')
```
So the regularized polynomial model has its coefficients biased towards zero. But I actually strongly prefer the unregularized model, even in this case. I suppose I might have chosen a different value for lambda.
In the end, I need to question whether you really wanted to fit a high-order polynomial model.
A smoothing spline might be a far better choice. Or use my SLM Toolbox, fitting a regression spline.
The idea is that you are using a high-order polynomial model because you don't know anything better to use, but you have data where a low-order model does not give you an adequate fit. A spline model is a better choice, by far. Just throwing extra coefficients at the problem always leads to ill-posed problems, singular matrices, and, in the end, garbage coefficients. Then you decide to try regularization. A bad idea. You are solving this the wrong way. USE A SPLINE.
My SLM toolbox is available for download from the file exchange.
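As a minimal illustration of the spline alternative, here is a cubic smoothing spline fit using csaps from the Curve Fitting Toolbox (not the SLM toolbox itself; the smoothing parameter 0.999 and the noise level are just example choices):

```matlab
x = linspace(-2*pi, 2*pi);
y = cos(x) + 0.05*randn(size(x));   % noisy synthetic data

% cubic smoothing spline; p near 1 follows the data closely,
% p near 0 approaches a least-squares straight line
pp = csaps(x, y, 0.999);

xq = linspace(-2*pi, 2*pi, 500);
plot(x, y, 'bo', xq, fnval(pp, xq), 'r-')
legend('data', 'smoothing spline')
```

No high-order coefficients to regularize, and no ill-posed Vandermonde system to solve.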

#### 0 Comments


Hiro Yoshino on 14 Nov 2019
First of all, fitlm is for linear regression.
But I'm sure what you're actually doing is linear regression - you just do not understand the definition of a "linear model".
(When the model is linear with respect to your coefficients, it is called "linear regression".)
Linear model: y = X*b + e
The estimator is given by b_hat = inv(X'*X)*X'*y, and this estimator is what fitlm computes.
If you want to perform least squares with regularization, try one of these:
L1 (lasso), or the L1 + L2 combination (elastic net): lasso
L2 (ridge): ridge
The L1 penalty shrinks more aggressively than L2 and drives coefficients exactly to zero, so you get a sparse estimate; the important predictors are therefore selected automatically.
The estimator for L2 is given by b_hat = inv(X'*X + ξ*eye(p))*X'*y, where you need to specify the regularization parameter ξ; a parameter also has to be chosen for L1.
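For the linear case, a minimal sketch of lasso / elastic-net fitting with cross-validated selection of the regularization parameter (the predictor matrix X and response y here are synthetic placeholders):

```matlab
rng default
X = randn(100, 8);                          % 8 candidate predictors
y = X(:,[1 3]) * [2; -1.5] + 0.1*randn(100,1);  % only 2 truly matter

% Alpha = 1 is pure lasso; 0 < Alpha < 1 gives an elastic net
[B, FitInfo] = lasso(X, y, 'Alpha', 0.75, 'CV', 10);

% coefficients at the lambda with minimum cross-validated MSE;
% unimportant predictors come out exactly zero
bestB = B(:, FitInfo.IndexMinMSE)
```

This is the automatic-selection behavior the answer describes, but note it applies to models that are linear in the coefficients.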

#### 0 Comments


Carlos Amador on 14 Nov 2019
Hello Hiro, thanks for coming by. I know what a linear model is; that's the reason I used the word polynomial for the other example. But I'm using neither a linear model nor a polynomial model.
I'm not sure you actually understood my question. I'm using fitnlm, not fitlm, and I'm doing nonlinear regression, not linear regression ("But I'm sure you're doing actually Linear regression ..."???).
I would like a regularization method that works directly with my mechanistic nonlinear model.
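One way to get ridge-style regularization for an arbitrary nonlinear model is to bypass fitnlm and minimize the penalized residual directly, e.g. with lsqnonlin from the Optimization Toolbox. A sketch, with a placeholder model function standing in for the mechanistic model:

```matlab
% x, y: your data (column vectors), assumed already in the workspace.
% placeholder mechanistic model, nonlinear in the coefficients b
model = @(b,x) b(1)*exp(-b(2)*x) + b(3)./(1 + b(4)*x);

lambda = 0.1;     % regularization strength, to be tuned
b0 = [1 1 1 1];   % starting guess

% augment the residual vector so the implied objective becomes
% sum((model(b,x) - y).^2) + lambda*sum(b.^2)
penres = @(b) [model(b, x(:)) - y(:); sqrt(lambda)*b(:)];
bhat = lsqnonlin(penres, b0);
```

This is the same augmented-system trick as in the polynomial answer above, applied through a nonlinear solver; you lose fitnlm's p-values and diagnostics, and lambda would typically be chosen by cross-validation.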

#### 1 Comment

Hiro Yoshino on 14 Nov 2019
I'm sorry - I thought you were using fitlm. I took it wrong.
But please note that a linear model is not always a polynomial.
A model such as y = b1*sin(x) + b2*exp(x) is also a linear model, because it is linear in the coefficients. I thought your problem was linear since I thought you were using fitlm.
Nonlinear + regularization, that's what you want. OK.
Personally I've never heard of such a thing before; if you find out, please let me know whether it's possible to do.
This may help you... hope it works for you:
Also, it looks like there are regularization models for GLMs available in MATLAB:
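The GLM route would be lassoglm; a minimal sketch for a lasso-regularized logistic model (the data here is synthetic, purely for illustration):

```matlab
rng default
X = randn(200, 5);                          % 5 candidate predictors
p = 1 ./ (1 + exp(-(2*X(:,1) - X(:,2))));   % true model uses only 2
y = binornd(1, p);                          % binary response

% lasso-regularized logistic regression, 10-fold cross-validation
[B, FitInfo] = lassoglm(X, y, 'binomial', 'CV', 10);

% sparse coefficient vector at the minimum-deviance lambda
B(:, FitInfo.IndexMinDeviance)
```

A GLM is still linear in its coefficients through the link function, so this covers only part of the original mechanistic-model question.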

