VHTMIMOPacketErrorRateExample - Why is LDPC on IEEE 802.11ac worse than BCC?
Hi All, I ran the WLAN System Toolbox example VHTMIMOPacketErrorRateExample, which provides a PER simulation of IEEE 802.11ac in an 8x8 MIMO configuration using BCC channel coding.
Now I tried to use LDPC instead of BCC, expecting it to perform somewhat better (it does on IEEE 802.11n), but the output is much worse.
Does anybody know where this poor performance could come from? I used the standard TGacChannel and did not change anything in the example besides the channel coding.
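Concretely, the only change I made was switching the channel coding in the VHT configuration, roughly like this (property names per wlanVHTConfig; the other values just mirror the example's 8x8 setup):

```matlab
% Start from the example's 8x8 VHT configuration and switch
% the channel coding from the default BCC to LDPC
cfgVHT = wlanVHTConfig;
cfgVHT.NumTransmitAntennas = 8;
cfgVHT.NumSpaceTimeStreams = 8;
cfgVHT.ChannelCoding = 'LDPC';   % only change vs. the shipped example
```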
Thanks in advance,
BR,
Jérôme
Answers (2)
BABA CHOUDHURY
2 January 2019
Hi Jerome, I know it's very late to respond to your query.
Still, I was also running similar simulations and found LDPC to outperform BCC in every scenario. Maybe some other parameter is affecting your calculations.
Darcy Poulin
12 August 2020
I had exactly the same issue and spoke with MathWorks support.
It turns out that you should configure the LDPC decoding method to use 'norm-min-sum' rather than the default 'bp' algorithm. When I made this change, I saw the predicted improvement in link performance.
For 11ac, you configure it like this:
rxPSDU = wlanVHTDataRecover(vhtdata, chanEst, nVarVHT, cfgVHT, 'LDPCDecodingMethod', 'norm-min-sum');
The same thing occurs in 11ax. Here you configure it like this:
rxPSDU = wlanHEDataBitRecover(eqDataSym,nVarEst,csi,cfgHE,'LDPCDecodingMethod','norm-min-sum');
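If you want to dig further, wlanVHTDataRecover also lets you raise the LDPC iteration limit, so you can check whether the default 'bp' decoder simply needs more iterations; a sketch, assuming the 'MaximumLDPCIterationCount' name-value pair from the same release's interface:

```matlab
% Sketch: normalized min-sum decoding with an explicit iteration cap;
% 'MaximumLDPCIterationCount' is assumed from the documented
% name-value interface of wlanVHTDataRecover
rxPSDU = wlanVHTDataRecover(vhtdata, chanEst, nVarVHT, cfgVHT, ...
    'LDPCDecodingMethod', 'norm-min-sum', ...
    'MaximumLDPCIterationCount', 12);
```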