Hello Community Experts,
I'm seeking assistance with my E2E wireless communication simulation, which consists of three components: Baseband Tx (DBPSK modulation), RF Receiver, and Baseband Rx (Direct Conversion method). Here are the details:
Baseband Tx:
- Original bits: 208, padded with 10 zeros → Total Tx bits: 218
- Modulation: DBPSK
- Filter: RRC Tx filter, filter span = 10, output samples per symbol = 2 (this stage is sketched right after this list)
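For reference, here is roughly what my Baseband Tx stage does, written as a MATLAB sketch. The roll-off factor of 0.5 is an assumed placeholder (the actual value is set in the filter block's mask), and the payload bits are stand-ins:

```matlab
% Baseband Tx sketch: 208 payload bits + 10 zero pad bits -> DBPSK -> RRC shaping.
payloadBits = randi([0 1], 208, 1);         % stand-in for the real payload bits
txBits      = [payloadBits; zeros(10, 1)];  % 218 Tx bits per frame

dbpskMod = comm.DBPSKModulator;
txFilt   = comm.RaisedCosineTransmitFilter( ...
    'RolloffFactor',          0.5, ...      % assumed placeholder value
    'FilterSpanInSymbols',    10, ...
    'OutputSamplesPerSymbol', 2);

txSymbols = dbpskMod(txBits);               % 218 DBPSK symbols
txWave    = txFilt(txSymbols);              % 436 samples per frame (218 * 2)
```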
RF Receiver:
- Input: an Unbuffer block converts each 436-sample frame (218 × 2) into a stream of single samples.
- Output: a Buffer block converts the single samples back into 436-sample frames (Output buffer size = 436).
- The other settings are shown in the screenshot below (the rate bookkeeping is sketched after this list).
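The rate conversion around the RF Receiver is just the following bookkeeping (a sketch only; in the model this is done by the Unbuffer/Buffer pair):

```matlab
% Frame-size bookkeeping around the RF Receiver subsystem.
bitsPerFrame    = 218;                 % 208 payload + 10 pad bits
sps             = 2;                   % Tx filter output samples per symbol
samplesPerFrame = bitsPerFrame * sps;  % 436 samples in, 436 samples out
% Unbuffer: one 436x1 frame -> 436 scalar samples; Buffer: scalars -> 436x1 frame.
```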
Baseband Rx:
- AGC desired output: 0.5 W
- Filter: RRC Rx filter, filter span = 10, input samples per symbol = 2, Decimation Factor = 2
- Tail padding removal: trim to 208 bits before demodulation
- Demodulation: DBPSK, then convert the bit stream back to a string (this stage is sketched below)
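Again as a sketch (roll-off assumed at 0.5 to match the Tx sketch above, and rxWave shown as a simple loopback placeholder), the Baseband Rx stage amounts to:

```matlab
% Baseband Rx sketch: AGC -> RRC matched filter (decimate by 2) -> trim -> DBPSK demod.
agc    = comm.AGC('DesiredOutputPower', 0.5);  % desired output power: 0.5 W
rxFilt = comm.RaisedCosineReceiveFilter( ...
    'RolloffFactor',         0.5, ...          % assumed; must match the Tx filter
    'FilterSpanInSymbols',   10, ...
    'InputSamplesPerSymbol', 2, ...
    'DecimationFactor',      2);
dbpskDemod = comm.DBPSKDemodulator;

rxWave    = txWave;                  % loopback placeholder; in the model this
                                     % frame comes from the RF Receiver subsystem
rxSymbols = rxFilt(agc(rxWave));     % 218 symbols per frame after decimation
% The Tx + Rx RRC pair has a group delay of span/2 + span/2 = 10 symbols, which
% is what the 10 pad bits are meant to absorb, leaving 208 useful symbols.
payloadSyms = rxSymbols(11:end);     % drop the 10 delay/pad symbols -> 208 left
payloadBits = dbpskDemod(payloadSyms);
```

Here I drop the leading 10 symbols to account for the filter group delay; in my actual model the trim to 208 bits happens before demodulation as listed above.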
The issue: the first frame (the first 208, or possibly 218, bits) consistently comes out as all zeros. It looks as if there is an unintended delay somewhere in the model, but I can't pinpoint where.
Could anyone suggest how to eliminate this delay, or could it come from some other cause? One diagnostic I had in mind is sketched below.
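To localize the delay, I could log the Tx and Rx bit streams over several frames and cross-correlate them offline. Something like the following, where txBitLog and rxBitLog are hypothetical vectors of logged bits and finddelay comes from the Signal Processing Toolbox:

```matlab
% Measure end-to-end latency in bits by cross-correlating logged Tx/Rx streams.
% txBitLog / rxBitLog: hypothetical column vectors of bits over several frames.
d = finddelay(double(txBitLog(:)), double(rxBitLog(:)));
fprintf('Measured end-to-end delay: %d bits (%.2f frames of 218)\n', d, d / 218);
```

If the first frame really is all zeros, I would expect this to report a delay of roughly one full frame (218 bits).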
Thank you!
Best Regards,
Fumihiko Sato