Which function can I use instead of fsolve to solve a system of nonlinear equations with a global optimisation?

39 views (last 30 days)
I am trying to solve three nonlinear equations, and I do not have good initial guesses for them. I have read in various documents that the solution may be to use global optimisation functions instead of local ones such as fsolve.
However, I am not very familiar with these functions (fminunc, patternsearch, ga, ...).
Could anyone help me with that?
Thank you in advance!

Accepted Answer

Walter Roberson on 26 Oct 2020
First of all, mathematical theory tells us that there exist functions where knowing the value of the function at one location gives you no information about its value at any other location. It follows that there exist nonlinear equations which cannot be solved by gradient descent, by any genetic algorithm (that does not exhaustively try all possibilities), by Newton's method, by the simplex method, or by anything else implemented in the functions you list -- not short of flailing around and happening to hit an exact solution by chance.
Second of all: when you have "black box" functions (a function handle for code you are not permitted to examine analytically), it can be quite difficult to find solutions of equations even when clear solutions exist analytically. Therefore none of the functions you list can guarantee finding global minima or solving nonlinear equations -- not outside of certain narrow classes of functions.
Third: every equation f(x) == b has an equivalent minimization problem, minimizing norm(f(x)-b): the equation is solved exactly where that norm is 0, and hopefully solved to within accuracy and round-off error at the minimizer if no exact zero can be found. For real-valued functions, the squared norm (f(x)-b).^2 can be substituted.
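As a minimal sketch of that recasting (using a hypothetical scalar equation x*exp(x) == 2, not the asker's system):

```matlab
% Recast the equation f(x) == b as minimizing the squared residual.
f = @(x) x.*exp(x);          % hypothetical left-hand side
b = 2;                       % right-hand side
obj = @(x) (f(x) - b).^2;    % zero exactly where f(x) == b

x0 = 1;                      % starting guess
[xmin, fval] = fminsearch(obj, x0);
% If fval is numerically 0, then xmin solves the original equation.
```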
Fourth: for multiple real-valued equations, you can minimize (f1(x)-b1).^2 + (f2(x)-b2).^2 + etc. If that sum reaches zero, then all of the equations are solved simultaneously. (For practical reasons, you might need to scale the function values relative to each other so that some equations are not effectively ignored.)
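For a three-equation system like the asker's, that looks something like the following (the residuals f1, f2, f3 and the weights w are hypothetical placeholders for your own equations):

```matlab
% Combine three residuals into one scalar objective over a 3-vector x.
f1 = @(x) x(1)^2 + x(2) - 1;     % hypothetical equation 1: x1^2 + x2 == 1
f2 = @(x) x(2)*x(3) - 2;         % hypothetical equation 2: x2*x3  == 2
f3 = @(x) x(1) + x(3) - 3;       % hypothetical equation 3: x1+x3  == 3
w  = [1 1 1];                    % optional per-equation scaling weights
obj = @(x) w(1)*f1(x)^2 + w(2)*f2(x)^2 + w(3)*f3(x)^2;

x0 = [0.5 0.5 0.5];              % starting guess
[xbest, fval] = fminsearch(obj, x0);
% fval near 0 means all three equations are solved simultaneously.
```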
Fifth: for nonlinear functions it is really common for the solution space to be so bumpy that all of the fmincon / fminunc algorithms get stuck in local minima, sometimes "near" a solution but sometimes quite far from one. fminsearch() tends to get stuck less easily, so it tends to get closer to solutions given enough time, but there are nonlinear functions that it will go indefinitely wrong on. In particular, if there is an asymptotic minimum towards infinity but a deeper minimum somewhere else, it is common for fminsearch to get stuck searching along the asymptote.
There are no algorithms that can guarantee finding solutions (or global minimum) outside of some narrow cases.
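With no guarantees available, a common heuristic is to restart a local solver from many random points and keep the best result; this is roughly what MultiStart in the Global Optimization Toolbox automates. A crude hand-rolled sketch, assuming box bounds lb/ub that you would choose for your own problem (the objective here is the hypothetical three-equation example, not the asker's):

```matlab
% Multistart heuristic: many random starts, keep the best fminsearch result.
% No guarantee of a global solution, but it often escapes individual local minima.
obj = @(x) (x(1)^2 + x(2) - 1)^2 + (x(2)*x(3) - 2)^2 + (x(1) + x(3) - 3)^2;
lb = [-5 -5 -5];  ub = [5 5 5];   % assumed search box
rng(0);                           % reproducible random draws
best.fval = inf;
for k = 1:50
    x0 = lb + (ub - lb).*rand(1, 3);   % random start within the box
    [x, fval] = fminsearch(obj, x0);
    if fval < best.fval
        best.x = x;  best.fval = fval;
    end
end
% best.fval near 0 suggests best.x solves the system (to round-off error).
```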
You can do a bit better if you have the Symbolic Math Toolbox and your equations can be differentiated.
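For example, assuming the Symbolic Math Toolbox is available and the equations can be written symbolically, vpasolve() can find numeric roots without needing a good starting guess (the two-equation system here is hypothetical):

```matlab
% Symbolic route: let the symbolic engine hunt for roots directly.
syms x y
eqs = [x^2 + y == 1, x*y == 2];   % hypothetical system
S = vpasolve(eqs, [x y]);         % S.x and S.y hold the numeric roots found
```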
2 Comments
Saunok Chakrabarty on 5 Nov 2023 (edited 5 Nov 2023)
I am facing a similar issue. My first-order optimality measure is stuck in the range 2.75e+03 to 2.98e+03, far from zero. Any suggestions on how to improve this?
John D'Errico on 5 Nov 2023
Sorry. Walter said it all.
There is no assurance your problem has a solution at all. If it does, then you ABSOLUTELY NEED to choose better starting values. And no, there is no magic scheme to find better starting values.
Global techniques are no assurance of finding a solution either, since they are really only heuristics that try to work around the problem of poor starting values. They might help, or maybe not. It is worth a try though.


More Answers (0)

Release: R2020b
