Radar simulation: How to apply a precise delay?
Hi all,
I am trying to create a simple radar baseband channel model. As part of this, I want to generate a delayed version of my transmit baseband signal:
$$s_\mathrm{rx}(t) = s_\mathrm{tx}(t - \tau), \qquad \tau = \frac{2R}{c}$$
The signals are sampled with sampling frequency $f_s$, so applying the delay by shifting the signal by an integer number of samples gives a "delay resolution" of only $T_s = 1/f_s$. As the locations of targets generally lead to delays $\tau$ that are not integer multiples of $T_s$, this approach introduces an error. Are there ways of applying a delay that are not limited by the time discretization (other than upsampling, which I already apply), or is this a limitation that cannot be overcome? In particular:
- Is it possible to perform a frequency-dependent phase shift according to the shift theorem of the DFT, i.e. multiply the spectrum $X(f)$ by $e^{-j 2\pi f \tau}$ instead of shifting in time? And does this give a finer delay resolution than $T_s$? (A sketch of what I have in mind follows this list.)
- I saw that there is a FreeSpace object in Phased Array System Toolbox that also applies a delay to a signal. Does anyone know how this is implemented?
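For reference, this is a minimal sketch of the frequency-domain approach I have in mind; the pulse shape, sampling rate, and delay value are only example numbers:

```matlab
% Sketch: apply a fractional delay tau via the DFT shift theorem.
fs  = 1e9;                               % sampling frequency (1 GHz, example)
t   = (0:1023).'/fs;                     % time axis
x   = exp(-((t - 200e-9)/20e-9).^2);     % example baseband pulse
tau = 3.7e-9;                            % desired delay, not a multiple of 1/fs

N = numel(x);
f = (0:N-1).'*fs/N;                      % DFT bin frequencies
f(f >= fs/2) = f(f >= fs/2) - fs;        % wrap bins to [-fs/2, fs/2)
y = ifft(fft(x) .* exp(-1j*2*pi*f*tau)); % X(f) -> X(f)*exp(-j*2*pi*f*tau)
y = real(y);                             % x is real here, drop numerical residue
```

Note that this corresponds to a circular shift, so I make sure the pulse does not wrap around the ends of the vector (e.g. by zero-padding).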
Accepted Answer
Honglei Chen
15 Mar 2019
The FreeSpace object in Phased Array System Toolbox uses a fractional delay filter to approximate the delay between samples.
HTH
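For illustration, a fractional-delay FIR can be sketched as a windowed-sinc interpolator along the following lines. This is a generic example with made-up parameters, not the actual implementation inside phased.FreeSpace; `sinc` and `hamming` come from Signal Processing Toolbox:

```matlab
% Generic fractional-delay sketch: split the delay into an integer sample
% shift plus a fractional part approximated by a windowed-sinc FIR.
fs    = 1e9;                 % sampling frequency (example)
tau   = 12.34/fs;            % total delay in seconds (example)
D     = tau*fs;              % delay in samples
Dint  = floor(D);            % integer part: plain sample shift
dfrac = D - Dint;            % fractional part in [0,1)

L = 8;                                   % half-length of the interpolation filter
n = (-L:L).';
h = sinc(n - dfrac) .* hamming(2*L+1);   % windowed-sinc taps
h = h/sum(h);                            % normalize DC gain

x = randn(1024,1);                       % example input signal
y = filter(h, 1, [zeros(Dint,1); x]);    % integer shift, then fractional-delay FIR
% The FIR adds a fixed group delay of L samples that you would compensate for.
```

DSP System Toolbox also ships dsp.VariableFractionalDelay, which implements this kind of fractional-delay interpolation as a System object.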