The optimization algorithms
Version 04.2023.01 (143 KB) by
Kenouche Samir
Newton, quasi-Newton, and gradient descent algorithms are implemented for a set of one- and two-dimensional objective functions.
The optimization algorithms (Newton, quasi-Newton, and gradient descent methods) are implemented for objective functions in one and two dimensions. The Newton and quasi-Newton methods can run into difficulties, for instance when the Hessian is too complex to compute or does not exist. They also require a matrix inversion at each iteration, which can be prohibitive for optimization problems involving many variables; these methods can therefore become impractical. An alternative is the family of gradient descent algorithms, which require neither the explicit computation of the Hessian nor an approximation of it. A gradient descent algorithm is implemented by choosing successive descent directions and the amplitude of the descent step along each chosen direction. This family of algorithms is widely used for optimizing problems of varying complexity. The term descent arises because these algorithms search for extrema by moving in the direction opposite to the objective function's gradient.
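As a rough illustration of the contrast drawn above, the following minimal MATLAB sketch runs a Newton iteration loop and a fixed-step gradient descent loop side by side on a hypothetical one-dimensional test function; the function, step size, and tolerances are assumptions made for this example and are not taken from the submission.

```matlab
% Minimal sketch (not the author's code) contrasting a Newton step with a
% gradient descent step on an assumed 1D test function f(x) = x^4 - 3x^2 + x.
f   = @(x) x.^4 - 3*x.^2 + x;
df  = @(x) 4*x.^3 - 6*x + 1;     % first derivative (gradient)
d2f = @(x) 12*x.^2 - 6;          % second derivative (Hessian in 1D)

% Newton's method: needs the second derivative (Hessian) at every iteration.
x = 2;
for k = 1:50
    step = df(x) / d2f(x);       % in n dimensions this is a linear solve, H \ grad
    x = x - step;
    if abs(step) < 1e-10, break, end
end
fprintf('Newton minimizer:  x = %.6f, f(x) = %.6f (%d iterations)\n', x, f(x), k);

% Gradient descent: only the gradient and a fixed step size are needed.
y = 2; alpha = 0.02;
for k = 1:5000
    g = df(y);
    if abs(g) < 1e-8, break, end
    y = y - alpha*g;             % move opposite to the gradient (descent direction)
end
fprintf('Descent minimizer: x = %.6f, f(x) = %.6f (%d iterations)\n', y, f(y), k);
```

Both loops converge to the same local minimum from the same starting point, but the descent loop uses only first-order information, which is the trade-off the description emphasizes.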
Explanatory algorithmic schemes are available in the user guide.
Cite As
Kenouche Samir (2025). The optimization algorithms (https://www.mathworks.com/matlabcentral/fileexchange/128008-the-optimization-algorithms), MATLAB Central File Exchange. Retrieved .
MATLAB Release Compatibility
Created with:
R2023a
Compatible with any release
Platform Compatibility
Windows macOS Linux
optimization_algorithms
Version | Published | Release Notes
---|---|---
04.2023.01 | |