Why does a Hilbert transformer give a phase-shifted but amplitude-reduced signal?
7 views (last 30 days)
I have performed a Hilbert transform using an FIR Hilbert filter rather than the built-in MATLAB hilbert function. An example code is as follows:
fx = 10;                            % signal frequency in Hz
t = linspace(0,1,512);              % 512 samples over 1 s
x = 0.5*cos(2*pi*fx*t);             % sampled signal
h = firpm(26,[0.1 0.8],[1 1],'h');  % order-26 FIR Hilbert filter, passband [0.1 0.8] (normalized)
xh = conv(x,h,'same');              % Hilbert transform of x ('same' keeps x and xh aligned)
figure;
plot(t, [x;xh]); legend('Original','Transformed');
The resulting figure (original vs. transformed signal) shows the transformed waveform phase-shifted but with a noticeably reduced amplitude.
How can I fix this so that the result is equivalent to what the built-in hilbert function gives?
0 Comments
Accepted Answer
Honglei Chen
13 Nov 2017
The phase shift is expected, so I assume you are asking about the amplitude? That happens because the filter is not ideal. If you run
fvtool(h)
you can see that the magnitude response is not unity when the frequency is close to DC or Nyquist. In your case, the signal frequency is 10 Hz and the sampling rate is 512, so the signal sits close to DC and you see the attenuation. If you change your sampling rate to, say, 128, then the result will be similar to what you get from hilbert.
HTH
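This can be checked numerically by evaluating the filter's magnitude response at the signal's normalized frequency for both sampling rates. A minimal sketch, assuming the Signal Processing Toolbox and the same filter as in the question:

```matlab
h = firpm(26,[0.1 0.8],[1 1],'h');  % same Hilbert filter as in the question

fx = 10;               % signal frequency in Hz
w512 = 2*pi*fx/512;    % rad/sample at fs = 512 -> about 0.039*pi, below the 0.1*pi band edge
w128 = 2*pi*fx/128;    % rad/sample at fs = 128 -> about 0.156*pi, inside the passband
H = freqz(h, 1, [w512 w128]);       % complex response at those two frequencies
fprintf('|H| at fs=512: %.3f,  at fs=128: %.3f\n', abs(H(1)), abs(H(2)));
% The first magnitude is well below 1 (the attenuation seen in the plot);
% the second is close to 1, consistent with the fs = 128 suggestion.
```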
10 Comments
叶
12 Dec 2022
Do you have any idea how to solve this problem? I mean, if the signal amplitude is decreased, how can I recover the normal amplitude? What should I do?
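One way to recover the amplitude, sketched under the assumption that the signal is narrowband with a known dominant frequency fx (so a single gain value suffices): measure the filter's gain at fx with freqz and divide it out.

```matlab
fx = 10;  fs = 512;
t = linspace(0,1,fs);
x = 0.5*cos(2*pi*fx*t);             % test signal from the question
h = firpm(26,[0.1 0.8],[1 1],'h');  % same Hilbert filter as in the question
xh = conv(x,h,'same');              % attenuated Hilbert transform

% Gain of the filter at the signal frequency (freqz wants a frequency vector,
% so the single frequency is duplicated):
g = abs(freqz(h, 1, [1 1]*2*pi*fx/fs));
xhc = xh / g(1);                    % amplitude-corrected output
```

For a broadband signal a single gain value will not do; there, a longer filter with a lower passband edge, or resampling as suggested in the accepted answer, is the more general fix.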
More Answers (0)