Calculating error between non-linear dataset and model
My model is a rotated ellipse, for which I have already worked out the best-fitting parameters for my data using the fit_ellipse submission here: https://uk.mathworks.com/matlabcentral/fileexchange/3215-fit_ellipse .
I have a set of pseudo-elliptical data points and would like to calculate the error between the data and the model; however, the data are not at regular intervals and I am unsure how to calculate the error when I do not have corresponding points. I suppose I should be able to calculate the y-values of the ellipse at a given x-value of my data - how would I go about this?
Apologies if this is just maths, but if anyone can help, that would be great!
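For context, below is a minimal sketch of two ways the error could be computed, assuming the fit has returned a centre (x0, y0), semi-axes a and b, and a rotation angle phi. These names and values are placeholders, and the actual field names and angle convention of the fit_ellipse output may differ, so check against a plot of the fitted ellipse. Option 1 is what the question suggests: substitute each data x into the rotated-ellipse equation and solve the resulting quadratic for y. Option 2 instead measures the distance from each point to the nearest point on a densely sampled ellipse.

x0  = 1.0;   y0  = -0.5;        % ellipse centre (placeholder values)
a   = 3.0;   b   = 1.5;         % semi-major / semi-minor axes
phi = 20*pi/180;                % rotation of the major axis, in radians

% Synthetic scattered data standing in for the real measurements
t = sort(2*pi*rand(200,1));
x = x0 + a*cos(phi)*cos(t) - b*sin(phi)*sin(t) + 0.05*randn(200,1);
y = y0 + a*sin(phi)*cos(t) + b*cos(phi)*sin(t) + 0.05*randn(200,1);

% Option 1: y-value of the ellipse at each data x (vertical residual).
% Substituting x into the rotated-ellipse equation gives a quadratic in
% dy = y - y0:  A*dy^2 + B*dy + C = 0.
dx   = x - x0;
A    = sin(phi)^2/a^2 + cos(phi)^2/b^2;
B    = 2*dx*cos(phi)*sin(phi)*(1/a^2 - 1/b^2);
C    = dx.^2*(cos(phi)^2/a^2 + sin(phi)^2/b^2) - 1;
disc = B.^2 - 4*A*C;            % negative where x lies outside the ellipse
disc(disc < 0) = NaN;
yTop = y0 + (-B + sqrt(disc))/(2*A);
yBot = y0 + (-B - sqrt(disc))/(2*A);
yEll = yTop;                    % keep whichever branch is closer to the data
swap = abs(y - yBot) < abs(y - yTop);
yEll(swap) = yBot(swap);
residVert = y - yEll;           % vertical residuals (NaN where undefined)

% Option 2: distance from each point to the nearest point on the ellipse.
% This behaves better near the ends of the ellipse, where the curve is
% nearly vertical and a y-residual is huge or undefined.
tt = linspace(0, 2*pi, 2000);
xe = x0 + a*cos(phi)*cos(tt) - b*sin(phi)*sin(tt);
ye = y0 + a*sin(phi)*cos(tt) + b*cos(phi)*sin(tt);
dist = min(hypot(x - xe, y - ye), [], 2);   % needs implicit expansion (R2016b+)

rmsVert = sqrt(mean(residVert.^2, 'omitnan'))
rmsGeom = sqrt(mean(dist.^2))

Note that the vertical residuals are NaN for any point whose x falls outside the horizontal extent of the ellipse, which is one reason the nearest-point distance is often a more useful error measure for a closed curve.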
0 Comments
Answers (0)