Minimization problem with constraint
Hello, the problem is as follows: minimize R subject to (x-a_i)^2 + (y-b_i)^2 ≤ R^2 for all i.
I am looking to find x, y, and R, knowing that:
- a_i and b_i are known values (two 100×1 vectors).
- x and y lie within the min and max of a_i and b_i, respectively.
Is it possible to find a solution with the Optimization Toolbox in MATLAB? If not, any suggestion for a solution?
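Yes, once the formulation is made well posed by adding R ≥ 0 (see the answer below), this is a standard smallest-enclosing-circle problem that a general nonlinear solver can handle. A minimal sketch using SciPy's SLSQP, with random stand-in data for the 100×1 vectors (in MATLAB, `fmincon` with a nonlinear constraint plays the analogous role):

```python
import numpy as np
from scipy.optimize import minimize

# Stand-in data for the known 100x1 vectors a_i, b_i.
rng = np.random.default_rng(0)
a = rng.uniform(0.0, 10.0, 100)
b = rng.uniform(0.0, 10.0, 100)

# Decision variables z = [x, y, R]. Minimize R subject to
# R^2 - (x - a_i)^2 - (y - b_i)^2 >= 0 for all i, with R >= 0
# and (x, y) inside the bounding box of the data.
def objective(z):
    return z[2]

def coverage(z):
    x, y, R = z
    return R**2 - (x - a)**2 - (y - b)**2   # one inequality per point

# Feasible start: centroid, with R large enough to cover every point.
x0, y0 = a.mean(), b.mean()
z0 = [x0, y0, np.hypot(a - x0, b - y0).max()]

res = minimize(objective, z0, method="SLSQP",
               constraints=[{"type": "ineq", "fun": coverage}],
               bounds=[(a.min(), a.max()), (b.min(), b.max()), (0.0, None)])
x, y, R = res.x
```

The solution (x, y) is the center of the smallest circle enclosing all the points (a_i, b_i), and R is its radius.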
Accepted Answer
Other Answers (2)
Bruno Luong
13 Sep 2018
Edited: Bruno Luong, 13 Sep 2018
Here is a solution:
R = -Inf
x = anything between min(a_i) and max(a_i)
y = anything between min(b_i) and max(b_i)
(The constraint only involves R^2, so arbitrarily negative R is feasible as stated; add R ≥ 0 to make the problem well posed.)
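The point of the answer above is that the constraint is satisfied by any arbitrarily negative R, since only R^2 appears in it. A quick numerical check (with stand-in random data for the 100×1 vectors):

```python
import numpy as np

# Stand-in data for a_i and b_i.
rng = np.random.default_rng(1)
a = rng.uniform(0.0, 10.0, 100)
b = rng.uniform(0.0, 10.0, 100)

x, y = a.mean(), b.mean()   # any point in the allowed box works
R = -1e9                    # arbitrarily negative "radius"

# R^2 is huge, so every constraint (x-a_i)^2 + (y-b_i)^2 <= R^2 holds.
feasible = np.all((x - a)**2 + (y - b)**2 <= R**2)
```

Since R can be driven to -Inf while staying feasible, the minimum as originally posed is unbounded; constraining R ≥ 0 (or minimizing R^2 instead) fixes this.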