How to model a correcting function for shifted data?

Hi all,
I have a sensor reading that incorporates some time delay in its measurements, so the measured values are shifted from the ideal values (illustrated in the figure below). I have a set of data from both the ideal and the measured signals. Later, however, I want to use a model that can 'correct' a measurement to 'estimate' its ideal value using only a single data point (from one particular time step).
What technique should I use? Any help will be very much appreciated!
Thanks,
Ghazi
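
The thread has no answers, but one common approach for this kind of problem is: first estimate the delay from the calibration pair (ideal vs. measured) via cross-correlation, then apply the inverse shift to new measurements. A minimal Python/NumPy sketch (names `estimate_lag` and `correct` are illustrative, not from the original post), assuming equal-length, uniformly sampled signals and a pure integer-sample delay:

```python
import numpy as np

def estimate_lag(ideal, measured):
    """Estimate the integer sample delay between two equal-length signals
    by locating the peak of their cross-correlation."""
    ideal = ideal - np.mean(ideal)           # remove offsets so the peak
    measured = measured - np.mean(measured)  # reflects shape, not bias
    corr = np.correlate(measured, ideal, mode="full")
    # Index len(ideal)-1 of the full correlation corresponds to zero lag;
    # a positive result means `measured` is delayed relative to `ideal`.
    return int(np.argmax(corr)) - (len(ideal) - 1)

def correct(measured, lag):
    """Undo an integer delay by shifting the measured signal back.
    np.roll wraps around, so the first `lag` samples are unreliable."""
    return np.roll(measured, -lag)
```

Once the lag is known from the calibration data, a single later measurement taken at time step t can be read as an estimate of the ideal value at time step t - lag. If the sensor also distorts amplitude (not just timing), you could additionally fit a regression from measured(t) to ideal(t - lag) on the calibration set.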

Answers (0)


Asked: 26 February 2016
