What determines the time axis in spectrogram using normalized frequencies?
I'm attempting to make a spectrogram using normalized frequencies, and the resulting spectrogram looks as expected with the exception of the time axis. Specifically, while the true duration of the data is ~8 minutes, when calculating the spectrogram the time axis ranges to ~21.5 hours!
I've tried manipulating each of the input arguments, and the only thing that seems to affect the time axis is the length of the input data vector - which obviously doesn't make the time axis any more accurate. Any help would be greatly appreciated.
PS: I calculate the normalized frequencies as follows:
FreqsInHz = [0.5:0.1:50]; %The range of frequencies I'm interested in.
SamplingRate = 1000; % Hz
normFreqs = (2*pi).*FreqsInHz./SamplingRate; % Normalized frequencies in rad/sample.
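For context, a minimal sketch of the kind of call that reproduces the issue (the data vector, window length, and overlap are placeholders, not from the original post): when spectrogram is given normalized frequencies and no fs, the returned time output is in samples rather than seconds.
% Hypothetical reproduction (window/overlap values are illustrative only)
SamplingRate = 1000;                          % Hz
data = randn(1, 8*60*SamplingRate);           % ~8 minutes of placeholder data
FreqsInHz = 0.5:0.1:50;
normFreqs = (2*pi).*FreqsInHz./SamplingRate;  % rad/sample
[s, w, t] = spectrogram(data, 512, 256, normFreqs);  % t comes back in samples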
Answers (1)
Sean de Wolski
1 Aug 2016
Time is a function of the sampling frequency, fs.
Compare
spectrogram(rand(1,100000),64,0,64,8000)
spectrogram(rand(1,100000),64,0,64,44100)
For details on how to use fs, see:
>> doc spectrogram
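Applied to the original question, a minimal sketch of the fix (reusing the placeholder data and window/overlap values from the sketch above): pass the frequency vector in Hz together with fs, and spectrogram returns the time vector in seconds.
SamplingRate = 1000;                 % Hz
FreqsInHz = 0.5:0.1:50;              % frequencies of interest, in Hz
[s, f, t] = spectrogram(data, 512, 256, FreqsInHz, SamplingRate);
% t is now in seconds, so an ~8-minute recording spans roughly 0 to 480 s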