Cross-Correlation to determine lag

I have two vectors that contain the time points at which a signal was recorded (in seconds). The first vector contains the time points at which a playback was triggered (around every 12 s). The other vector contains the time points at which I received an answer to that playback. I'm trying to calculate the lag of the answer relative to the onset of the playback. I tried the xcorr function, but can't make much sense of it. Any advice on how to go about this? And is it a problem that the second vector is shorter than the first (I didn't always receive an answer to every playback)?
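One reason xcorr is confusing here is that it operates on regularly sampled signals, not on lists of event timestamps. A common workaround is to bin the timestamps into event trains on a shared time grid and cross-correlate those. A minimal sketch, assuming the two timestamp vectors are named t1 and t2 and using a 10 ms bin width (the names and the bin width are placeholders, and xcorr requires the Signal Processing Toolbox):

```matlab
% t1, t2: vectors of event timestamps in seconds (hypothetical names)
dt = 0.01;                        % bin width in seconds (an assumption)
tMax = max([t1(:); t2(:)]);
edges = 0:dt:tMax + dt;
s1 = histcounts(t1, edges);       % event train for the playbacks
s2 = histcounts(t2, edges);       % event train for the answers
[c, lags] = xcorr(s2, s1);        % cross-correlate the binned trains
[~, iMax] = max(c);
typicalLag = lags(iMax) * dt;     % dominant lag (s) of s2 relative to s1
```

Note that this only recovers one dominant lag for the whole recording; if the answer latency varies from trial to trial, cross-correlation will blur those differences together.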

6 Comments

Image Analyst on 7 Jul 2016
Of course, it would be easier for people to help you if you attached your two signals.
Zenid on 7 Jul 2016
Thanks, I attached them now.
José-Luis on 7 Jul 2016
I am afraid those signals do not help to solve your question unless they have a timestamp.
Zenid on 7 Jul 2016
I was probably a little unclear and "signal" might be the wrong word. The data in the files are the timestamps of the signal onset. I'm interested in the lag of the second variable relative to the first.
José-Luis on 7 Jul 2016
So in some cases the second signal comes before the first signal?
Zenid on 7 Jul 2016
Well, the first one basically repeats every 12 s (±2 s) or every 6 s (±1 s), and in between, the second one sometimes answers the first. I'm interested in the latency of that answer relative to the onset of the first one.
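Given that the goal is the latency of each individual answer rather than one overall lag, cross-correlation may be the wrong tool entirely. A simpler sketch, assuming the sorted timestamp vectors are named playbackTimes and responseTimes (hypothetical names): for each answer, subtract the onset of the most recent preceding playback.

```matlab
% playbackTimes, responseTimes: sorted onset timestamps in seconds
% (hypothetical names). NaN marks answers with no preceding playback.
latency = nan(size(responseTimes));
for k = 1:numel(responseTimes)
    idx = find(playbackTimes <= responseTimes(k), 1, 'last');  % last playback before this answer
    if ~isempty(idx)
        latency(k) = responseTimes(k) - playbackTimes(idx);
    end
end
```

This also sidesteps the unequal-length issue, since each answer is matched to its own playback, and playbacks with no answer are simply skipped.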


Answers (0)

Asked: 7 Jul 2016

Commented: 7 Jul 2016

