Nonlinear fit to 100 data sets with seven unknowns

2 views (last 30 days)
Deniz Toprak on 17 Jul 2020
Commented: Star Strider on 4 Oct 2020
Hello all,
I have a problem with lsqcurvefit optimization. Here is my problem:
I have been working on a project that involves fitting non-linear equations to several data sets with lsqcurvefit. I want to fit 7 unknowns using a least-squares method (lsqcurvefit). The code runs, but the results for the seven unknowns are not good. The standard error is very large and the resnorm of the optimization is 0.5212. MATLAB finds a local minimum.
How can I solve this non-linear curve fitting problem? I want resnorm to be 10^-13.
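Roughly, my setup looks like this (a minimal sketch only; the model function, bounds, and starting point below are placeholders, not my actual equations):

% sketch of the fitting call; xdata and ydata are my measured data
model = @(p, xdata) p(1)*exp(-p(2)*xdata) + p(3)*sin(p(4)*xdata + p(5)) + p(6)*xdata + p(7);  % placeholder 7-parameter model
p0 = ones(7,1);                            % starting guess
lb = -10*ones(7,1);  ub = 10*ones(7,1);    % bounds (placeholders)
[pFit, resnorm] = lsqcurvefit(model, p0, xdata, ydata, lb, ub);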
Could anyone help me with this problem?
Thank you all,

Answers (2)

Star Strider on 17 Jul 2020
The residual norm you want may be unrealistic. I would use the ga (genetic algorithm) function (perhaps in a loop to run it several times) in order to get the best parameter estimates. See: How to save data from Genetic Algorithm in case MATLAB crashes? for a way of storing the best results from each ga call for later analysis.
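A rough sketch of that idea (assuming a model function fcn and data vectors xdata and ydata, with 7 parameters to estimate; the bounds and run count here are placeholders):

% run ga several times and keep the parameter set with the lowest sum of squared residuals
fitness = @(p) sum((fcn(p, xdata) - ydata).^2);   % scalar objective for ga
nvars = 7;                                        % number of parameters
lb = -10*ones(1,nvars);  ub = 10*ones(1,nvars);   % placeholder bounds
bestfval = Inf;
for k = 1:10
    [p, fval] = ga(fitness, nvars, [], [], [], [], lb, ub);
    if fval < bestfval
        bestfval = fval;
        bestp = p;
    end
end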
17 Comments
Deniz Toprak on 4 Oct 2020
Hi Star Strider,
It has been a while; thank you for your answers. I have been working on my code, but I can't get good results from the optimization. As I said before, I have 7 variables that I want to solve for, and I have 2500-10 000 data points to find these variables. My code first solves non-linear equations with fsolve, then it uses the data set to find these parameters (roughly as sketched below). To narrow down the wrong part of the code, I reduced the problem to 4 variables with 2500 data points, but I still didn't get the true parameters. Can you suggest a different method?
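The structure is roughly this (a sketch only; F stands for my actual implicit equations, and the inner starting guess is a placeholder):

% each model evaluation solves the implicit equations with fsolve,
% and the result is what lsqcurvefit compares against the data
function yhat = implicitModel(p, xdata)
    opts = optimoptions('fsolve', 'Display', 'off');
    yhat = zeros(size(xdata));
    for k = 1:numel(xdata)
        yhat(k) = fsolve(@(y) F(y, xdata(k), p), 1, opts);  % F and the guess 1 are placeholders
    end
end

% outer fit:
% pFit = lsqcurvefit(@implicitModel, p0, xdata, ydata, lb, ub);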
Star Strider on 4 Oct 2020
My pleasure!
I do not remember what we were doing with this. It has actually been months since this began.
If you are having problems getting appropriate parameter estimates, I would use the ga (genetic algorithm) function. It is usually better at finding appropriate parameters since it does not use gradient-descent algorithms.
However, your problem must always be well-posed, and the function you are fitting must be a realistic approximation of the process that describes the data. The ga documentation is straightforward, and there are a number of other examples using ga on MATLAB Answers (some of which I wrote) that will help you understand how to use it most effectively.



John D'Errico on 26 Jul 2020
Edited: John D'Errico on 26 Jul 2020
That you WANT the residual norm to be low is not really that relevant. Pick anything you may choose to want. For example, suppose you WANT to win the lottery. Is that relevant to winning the lottery? Not really. I suppose if you buy enough lottery tickets, it might be. :)
Not all models can be used to fit any set of data you would choose to fit them to. If that were true, then the art of mathematical modeling would be truly easy to practice.
The residual norm that you hope to see is extremely tiny compared to the one you actually got. Achieving it would probably only be possible if the data had been constructed to contain no error at all. If this is real measured data, then such an improvement in the residual norm seems highly unlikely.
There are several common reasons why such a modeling effort fails: the model is simply inappropriate to the data under study, the starting values were poor, or the model was coded improperly. I cannot know which of these is the case, or whether you simply misunderstand what the returned resnorm means.
My guess is that a misunderstanding of resnorm is most likely. I think you may be treating it as a convergence tolerance, almost a tolerance that you can set. That is not what it means. The residual norm is simply the sum of the squared errors of the model predictions compared to the target values you provide.
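In other words, you could compute it yourself from your fitted parameters:

% resnorm, as returned by lsqcurvefit, is just the sum of squared residuals
% (model, pFit, xdata, ydata are whatever you passed to, and got back from, lsqcurvefit)
resnorm = sum((model(pFit, xdata) - ydata).^2);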
In your duplicate question, you state that you used multi-start with lsqcurvefit, but you actually got poorer results than when you supplied your own starting values. That may simply mean your starting values were better than those chosen randomly by multi-start. Multi-start has no intelligence, no understanding of your model, and a high-dimensional space is a big place to search.
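For completeness, a systematic multi-start run of lsqcurvefit would look something like this (a sketch; model, p0, the bounds, and the data are placeholders for your problem), though no amount of restarting will rescue a model that cannot fit the data:

% MultiStart from the Global Optimization Toolbox, wrapped around lsqcurvefit
problem = createOptimProblem('lsqcurvefit', ...
    'objective', model, 'x0', p0, 'lb', lb, 'ub', ub, ...
    'xdata', xdata, 'ydata', ydata);
ms = MultiStart;
[pBest, resnormBest] = run(ms, problem, 50);   % 50 random start points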
This is not a problem with lsqcurvefit, but most likely a poorly fitting model for this data, and perhaps a misunderstanding of what the returned values mean.
