Practical Methods of Optimization
Could anyone share their MATLAB codes or best practices for these specific methods? I am particularly interested in how you handle the "narrow valley" convergence issue in the Coordinate Search method.
Codes for methods of optimization
One-Dimensional Methods
- Fibonacci Search Method
- Golden Section Search
- Dichotomous Search
- Newton’s Method
- Secant Method
- Quadratic Interpolation Method
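To make the request concrete, here is a minimal golden section search sketch for the bracketing methods above; the function handle, bracket [a, b], and tolerance are placeholders to adapt to your own problem:

function xmin = goldenSection(f, a, b, tol)
    % Golden section search: minimizes a unimodal f on the bracket [a, b].
    phi = (sqrt(5) - 1) / 2;               % inverse golden ratio, ~0.618
    x1 = b - phi*(b - a);  x2 = a + phi*(b - a);
    f1 = f(x1);  f2 = f(x2);
    while (b - a) > tol
        if f1 < f2                         % minimum lies in [a, x2]
            b = x2;  x2 = x1;  f2 = f1;
            x1 = b - phi*(b - a);  f1 = f(x1);
        else                               % minimum lies in [x1, b]
            a = x1;  x1 = x2;  f1 = f2;
            x2 = a + phi*(b - a);  f2 = f(x2);
        end
    end
    xmin = (a + b)/2;
end

For example, goldenSection(@(x) (x - 2).^2, 0, 5, 1e-6) returns approximately 2. Each iteration reuses one of the two interior evaluations, so only one new function call is needed per step.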
Multi-Dimensional Methods
- Univariate Method
- Hooke-Jeeves Pattern Search
- Nelder-Mead Simplex Method
- Rosenbrock Method
- Powell’s Conjugate Direction Method
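On the "narrow valley" issue: pure coordinate (univariate) search stalls in narrow diagonal valleys because axis-aligned steps make almost no progress there. The usual remedy is the Hooke-Jeeves pattern move, which extrapolates along the direction of recent improvement. A sketch, with f, x0, initial step h, and tolerance as illustrative assumptions:

function x = hookeJeeves(f, x0, h, tol)
    base = x0(:);
    while h > tol
        xt = explore(f, base, h);          % axis-by-axis exploratory moves
        if f(xt) < f(base)
            xp = xt + (xt - base);         % pattern move along the improving
            base = xt;                     % direction -- this is what lets the
            xe = explore(f, xp, h);        % search track a narrow valley
            if f(xe) < f(base), base = xe; end
        else
            h = h/2;                       % no progress: shrink the step
        end
    end
    x = base;
end

function x = explore(f, x, h)
    for i = 1:numel(x)
        for s = [h, -h]                    % try +h, then -h on coordinate i
            xt = x;  xt(i) = xt(i) + s;
            if f(xt) < f(x), x = xt; break; end
        end
    end
end

Dropping the pattern move (calling explore alone in a loop) reproduces the slow valley-crawling behavior of the plain coordinate search.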
Gradient-Based Methods
One-Dimensional Gradient:
- Steepest Descent Line Search
- Newton-Raphson Method
Multi-Dimensional Gradient:
- Steepest Descent Method
- Fletcher-Reeves Conjugate Gradient Method
- Polak-Ribière Conjugate Gradient Method
- Newton’s Method in Optimization
- Davidon-Fletcher-Powell Method
- Broyden-Fletcher-Goldfarb-Shanno Method
- Sequential Quadratic Programming
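As a baseline for the gradient-based family, here is steepest descent with a backtracking (Armijo) line search; the handles f and g (objective and its gradient) and the constants are assumptions for illustration:

function x = steepestDescent(f, g, x0, tol, maxIter)
    x = x0(:);
    for k = 1:maxIter
        d = -g(x);                         % steepest descent direction
        if norm(d) < tol, break; end       % gradient small enough: stop
        t = 1;                             % backtracking (Armijo) line search
        while f(x + t*d) > f(x) - 1e-4 * t * (d' * d) && t > 1e-12
            t = t/2;
        end
        x = x + t*d;
    end
end

On valley-shaped objectives such as Rosenbrock's function this zigzags badly, which is exactly why the conjugate gradient (Fletcher-Reeves, Polak-Ribière) and quasi-Newton (DFP, BFGS) variants listed above are usually preferred.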
10 Comments
Mark
about 3 hours ago
Star Strider
about 2 hours ago
Gradient descent methods are extremely sensitive to the choice of initial parameter estimate and can get trapped in local minima. The more robust global methods search the entire parameter space for the best candidates; the gradient-based methods can then 'fine-tune' those initial results.
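One way to combine the two, following this advice: coarse random restarts over a box, each refined locally with fminunc (requires the Optimization Toolbox; MultiStart in the Global Optimization Toolbox does this more systematically). The objective, bounds, and number of starts below are placeholders:

rng(0);                                    % reproducible random starts
f = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;   % example: Rosenbrock
lb = [-3 -3];  ub = [3 3];  nStarts = 25;
best.fval = Inf;
opts = optimoptions('fminunc', 'Display', 'off');
for k = 1:nStarts
    x0 = lb + rand(1, 2).*(ub - lb);       % random start inside the box
    [x, fval] = fminunc(f, x0, opts);      % local gradient-based fine tuning
    if fval < best.fval, best.x = x;  best.fval = fval; end
end
disp(best)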
Mark
about 3 hours ago
A large number of freely available codes on optimization are listed here:
Or you could visit File Exchange:
Mark
about 9 hours ago
Answers (0)