What functions are valid for optimization expressions?

1 view (last 30 days)
Jake Smith on 25 Jan 2019
Answered: John D'Errico on 25 Jan 2019
I get an error that an optimization expression cannot evaluate a logical NOT (~). I also tried functions such as "all()" and "unique()", which caused errors as well. Looking at the reference here:
https://www.mathworks.com/help/optim/ug/optim.problemdef.optimizationexpression.html
It appears that optimization expressions can only use very basic arithmetic functions like "sum()" or operations like transposing matrices. What types of functions can be used in optimization expressions beyond these?
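For illustration, here is a minimal sketch of the kind of expression that fails (the variable names are mine; this assumes the problem-based syntax with optimproblem and optimvar):

  prob = optimproblem;
  x = optimvar('x', 3);    % a 3-element optimization variable
  ok  = sum(3*x + 1);      % linear arithmetic and sum work fine
  bad = ~x;                % errors: logical NOT is not supported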

Accepted Answer

Alan Weiss on 25 Jan 2019
I'm sorry that you couldn't find this information in the documentation. See Supported Operations on Optimization Variables and Expressions.
This link, of course, points to the latest software version. Check the documentation for your own release if it is not the current one.
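For quick reference, a minimal sketch (mine, not from that page) of a few operations documented as supported; exact coverage varies by release:

  x  = optimvar('x', 2, 2);   % a 2-by-2 optimization variable
  e1 = 3*x + 1;               % scalar multiply and add
  e2 = x';                    % transpose
  e3 = sum(x, 1);             % sum along a dimension
  e4 = x(1, :) - x(2, :);     % indexing and subtraction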
Alan Weiss
MATLAB mathematical toolbox documentation

More Answers (1)

John D'Errico on 25 Jan 2019
NO. It is absolutely, simply NOT true that optimization expressions can "only" use simple operations like sum or matrix operations.
At the same time, almost all optimization tools require a continuous and differentiable objective function. (There are some tools that can handle more.)
Are logical operations such as all, not, or any continuous and differentiable? No. In general, use that as the clue: if a function is not differentiable, or not even continuous, then you are barking up the wrong tree.
For example, will an objective function that involves abs be admissible? Since abs fails to be differentiable at only one point, your optimization may succeed. At least it may converge to something where the optimizer thinks it is happy. But at the same time, there is a reasonable chance that you have ended up at a locally optimal solution that is not globally optimal. So I would suggest avoiding even something as moderately well behaved as abs.
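If you do need an absolute value, one standard workaround (my sketch, not part of this answer) is to reformulate it away with an auxiliary variable, which keeps the problem linear:

  % Minimize |x - 2| without calling abs: introduce t with
  % t >= x - 2 and t >= -(x - 2), then minimize t instead.
  prob = optimproblem;
  x = optimvar('x');
  t = optimvar('t', 'LowerBound', 0);
  prob.Objective = t;
  prob.Constraints.pos = t >= x - 2;      % the target value 2 is arbitrary
  prob.Constraints.neg = t >= -(x - 2);
  sol = solve(prob);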
Some tools are somewhat more robust than others. For example, tools like fsolve require the objective to be differentiable, because they form an internal estimate of the derivatives using finite differences. As such, if your function is not well behaved, expect garbage.
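A toy illustration of where fsolve is comfortable, on a smooth system of equations (the system below is my own example):

  % Both components are polynomials, so finite differencing behaves well.
  F = @(x) [x(1)^2 + x(2) - 3;
            x(1) - x(2)^2 + 1];
  sol = fsolve(F, [1; 1]);    % converges from a reasonable start point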
But how about fminsearch? fminsearch never differentiates the function, so SOMETIMES it may succeed where others fail. However, fminsearch will still probably fail if your function is not well behaved.
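For example (my sketch), fminsearch can often cope with a mildly nonsmooth objective that would trip up a gradient-based solver:

  % Nelder-Mead is derivative-free, so the kinks in abs do not break it.
  f = @(x) abs(x(1) - 1) + abs(x(2) + 2);
  xmin = fminsearch(f, [0, 0]);    % typically lands near [1, -2]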
In one dimension, fzero will be well behaved. It is quite robust, because it uses a protected iteration scheme that can fall back to bisection when things are going poorly. Similarly, fminbnd tries a quadratic scheme when things go well, but when they go poorly, it can fall back to a golden section search.
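A quick sketch of both (the example functions are mine):

  r = fzero(@(x) cos(x) - x, [0, 1]);      % sign change bracketed in [0, 1]
  m = fminbnd(@(x) abs(x - 0.3), 0, 1);    % nonsmooth, but 1-D and bounded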
Other tools, such as GA, or optimization schemes like simulated annealing, PSO, etc., are more robust to problems in the objective. By design, they can overcome problems that will cause other tools to fail miserably.
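For instance (my sketch; ga requires the Global Optimization Toolbox), a piecewise-constant objective that defeats gradient-based solvers is still fair game for ga:

  % ga never differentiates the objective, so flat steps are tolerated.
  f = @(x) sum(floor(x).^2);
  xbest = ga(f, 2);    % 2 decision variables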
However, NOTHING will absolutely assure success on an objective where the poser wants to make a specific algorithm fail. The best solution is to learn how the different algorithms work. Understanding your tools is the best way to decide which will work best on any given problem. If you don't understand the tools in your toolkit, then expect randomly poor results.
