Multivariate nonlinear regression model fitting

I apologize in advance, since I am new to MATLAB.
I have built a multivariate model to describe experimental data, and I am trying to set up a nonlinear regression fit to extract the model's parameters.
The model has two dependent variables that depend nonlinearly on two independent variables, and it has three parameters.
I found the mvregress function, but as I understand it, it performs multivariate linear regression, which does not apply to my problem.
Thank you in advance for any help.

Accepted Answer

Anton Semechko on 6 Jul 2018
Edited: Anton Semechko on 6 Jul 2018

1 vote

If the function you are trying to fit is linear in the model parameters, you can estimate those parameters using linear least squares (see the 'lsqlin' documentation). If the relationship between the model parameters and the function is nonlinear, use nonlinear least squares (see the 'lsqnonlin' documentation). For example, F(x,y,c1,c2,c3) = c1*x^2 + c2*exp(y) + c3*cos(x-y) is nonlinear in (x,y) but is a linear function of the model parameters (c1,c2,c3).
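To make the distinction concrete, here is a minimal sketch (with made-up synthetic data) fitting the example function above. Since F is linear in (c1,c2,c3), plain linear least squares via the backslash operator is enough; lsqlin would only be needed if you had bounds or linear constraints on the parameters:

```matlab
% Minimal sketch, hypothetical data: F is linear in (c1,c2,c3), so
% ordinary linear least squares recovers the parameters directly.
x = rand(100,1);  y = rand(100,1);        % independent variables
F = 2*x.^2 - 0.5*exp(y) + 3*cos(x-y);     % synthetic noise-free observations
A = [x.^2, exp(y), cos(x-y)];             % design matrix, one column per parameter
c = A\F;                                  % least-squares estimate of [c1; c2; c3]
```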

6 Comments

Jorge on 6 Jul 2018
Thank you very much. Can you clarify whether it is possible to use lsqnonlin with more than one dependent variable? For example, two observed variables that both depend on the predictors? I apologize if I am not posing the question clearly.
Anton Semechko on 6 Jul 2018
Edited: Anton Semechko on 6 Jul 2018
Yes, you can use 'lsqnonlin' to find the parameters of functions with any number of dependent and independent variables. This is because the cost function minimized in least squares is the total sum of squared residuals. For example, suppose F(x,y,C)=[f1(x,y,C),f2(x,y,C)] is a vector-valued function of two variables (x,y) and a set of parameters (C). Suppose Y={[f1_i,f2_i]} is a set of N observations and X={[x_i,y_i]} is the corresponding set of independent variables. The squared residual for the i-th observation is R2_i=(f1_i-f1(x_i,y_i,C))^2 + (f2_i-f2(x_i,y_i,C))^2. In least squares, the goal is to find the set of parameters (C) that minimizes the sum of R2_i over all N observations. I hope this example illustrates that R2_i can be computed for both scalar- and vector-valued functions.
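A sketch of this idea in code (the two-output model f1, f2 and the "true" parameters below are hypothetical, chosen only for illustration): lsqnonlin minimizes the sum of squares of whatever residual vector your function returns, so the residual function simply stacks the residuals of both outputs into one vector.

```matlab
% Sketch: fitting a hypothetical vector-valued model with lsqnonlin.
x = rand(50,1);  y = rand(50,1);               % independent variables
f1 = @(x,y,C) C(1)*x + C(2)*exp(-y);           % first output (illustrative)
f2 = @(x,y,C) C(3)*x.*y;                       % second output (illustrative)
Ctrue = [2; -1; 0.5];
F1obs = f1(x,y,Ctrue);  F2obs = f2(x,y,Ctrue); % synthetic observations
res = @(C) [F1obs - f1(x,y,C); F2obs - f2(x,y,C)];   % stacked residual vector
C = lsqnonlin(res, [1; 1; 1]);                 % estimate should approach Ctrue
```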
Jorge on 6 Jul 2018
Thank you for the reply. That is very helpful.
I am concerned, though, that if the values of the two dependent variables differ by orders of magnitude, the overall sum of squares will be dominated by the function that yields very large numbers. I could scale the outputs to be more similar (a form of weighting), but is there a formal way to deal with this situation? Perhaps there is a way to use weights in the actual fitting?
Thank you again! You have been very helpful!
Anton Semechko on 6 Jul 2018
Returning to the previous example, suppose that v1 and v2 are the variances of {f1_i} and {f2_i}. Define w=v1/v2. To ensure that f1 and f2 contribute equally to the sum of squares, compute the squared residuals as R2_i=(f1_i-f1(x_i,y_i,C))^2 + w*(f2_i-f2(x_i,y_i,C))^2.
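A sketch of this weighting with lsqnonlin (again with a hypothetical model and synthetic data): note that sqrt(w) is applied to the raw residuals, because lsqnonlin squares them internally, which reproduces the w-weighted sum of squares described above.

```matlab
% Sketch: variance-based weighting of a two-output fit (hypothetical model).
x = rand(50,1);  y = rand(50,1);
f1 = @(x,y,C) C(1)*x + C(2)*exp(-y);
f2 = @(x,y,C) 1000*C(3)*x.*y;                  % deliberately much larger scale
Ctrue = [2; -1; 0.5];
F1obs = f1(x,y,Ctrue);  F2obs = f2(x,y,Ctrue); % synthetic observations
w = var(F1obs)/var(F2obs);                     % w = v1/v2, as defined above
res = @(C) [F1obs - f1(x,y,C); sqrt(w)*(F2obs - f2(x,y,C))];
C = lsqnonlin(res, [1; 1; 1]);                 % weighted least-squares estimate
```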
Jorge on 6 Jul 2018
I see, fantastic! Thank you!
If I may ask further, is there a simple way to obtain confidence intervals for the parameters, perhaps using a bootstrap method? Thank you!
Anton Semechko on 6 Jul 2018
Edited: Anton Semechko on 6 Jul 2018
Bootstrapping is one option. Another is the jackknife (i.e., leave-one-out resampling). If you have a large dataset, though, bootstrapping may be the more effective option from a computational perspective.
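A minimal sketch of a percentile bootstrap for parameter confidence intervals (the single-output model and data here are hypothetical; the same loop works with the stacked residual function discussed earlier). Each iteration refits the model to a resampled-with-replacement copy of the observations:

```matlab
% Sketch: percentile-bootstrap confidence intervals for fitted parameters.
x = rand(50,1);  y = rand(50,1);
f = @(x,y,C) C(1)*x + C(2)*exp(-y) + C(3)*x.*y;   % hypothetical model
Fobs = f(x,y,[2; -1; 0.5]) + 0.05*randn(50,1);    % noisy synthetic observations
B = 500;  N = numel(x);  Cb = zeros(B,3);
for b = 1:B
    k = randi(N, N, 1);                           % resample indices with replacement
    Cb(b,:) = lsqnonlin(@(C) Fobs(k) - f(x(k),y(k),C), [1; 1; 1]);
end
ci = prctile(Cb, [2.5 97.5]);                     % 95% percentile CI per parameter
```

The prctile call assumes the Statistics and Machine Learning Toolbox, which is already in play here since the question mentions mvregress.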


More Answers (0)

Release: R2017a
Asked: 6 Jul 2018
Edited: 6 Jul 2018
