Implementing a basic low-pass filter
I have a basic question. I want to implement a simple first-order low-pass filter with a cutoff frequency of 100 Hz, and then apply it to a 1 Hz time-domain signal. I expect the output to be essentially the same as the input, but when I convolve the filter's impulse response with the input signal, I get an unwanted gain. I have attached my code. Can you help me with this?
FS = 1000;                                % sampling frequency, Hz
T = 1/FS;                                 % sampling period, s
fc = 100;                                 % cutoff frequency, Hz
r = 10 * 1000;                            % R = 10 kOhm
c = 1 / (2 * pi * r * fc);                % choose C so that 1/(2*pi*R*C) = fc
num = 1/(r*c);                            % first-order RC low-pass: H(s) = (1/RC)/(s + 1/RC)
den = [1 1/(r*c)];
sys_tf_model = tf(num, den);
time = 0:T:10;                            % 10 s time vector
signal = sin(2*pi*1*time);                % 1 Hz input signal
impulse_response = impulse(sys_tf_model, time);   % impulse response sampled at T
output1 = T * conv(signal, impulse_response);     % discrete approximation of the convolution integral
output2 = lsim(sys_tf_model, signal, time);       % reference simulation of the same filter
subplot(3,1,1); plot(time, output1(1:length(time))); title('Convolution');
subplot(3,1,2); plot(time, output2); title('Lsim output');
subplot(3,1,3); plot(time, signal); title('Input');
1 Comment
Sudarshan Kolar
3 Mar 2017
You are approximating a continuous-time convolution integral with a discrete sum, and the coarse sampling step introduces error.
Replace the sampling period T with T/10 and you will see that the gap shrinks.
I would not recommend conv for your application.
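As a rough illustration of that suggestion (my own sketch, reusing sys_tf_model from your code and a hypothetical 10x finer grid), the Riemann-sum convolution then tracks lsim much more closely:
FS2 = 10 * FS;                            % finer rate: 10 kHz instead of 1 kHz
T2 = 1/FS2;                               % i.e. T/10
time2 = 0:T2:10;
signal2 = sin(2*pi*1*time2);              % same 1 Hz test signal on the finer grid
h2 = impulse(sys_tf_model, time2);        % impulse response sampled every T2
y_conv = T2 * conv(signal2, h2);          % finer Riemann-sum approximation of the convolution integral
y_lsim = lsim(sys_tf_model, signal2, time2);
plot(time2, y_conv(1:length(time2)), time2, y_lsim);
legend('conv approximation', 'lsim');
With fc = 100 Hz the filter's time constant is about 1.6 ms, so a 1 ms step samples the impulse response too coarsely; the finer step is what removes most of the spurious gain.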
Answers (0)