Optimizing MATLAB-generated code for NVIDIA Drive AGX

Hi,
Is it possible to optimize MATLAB-generated code to use the maximum capacity of the NVIDIA Drive AGX? I am writing a script to create an around-view image using the GMSL cameras on the NVIDIA Drive, but using the MATLAB-generated CUDA code doesn't seem to help. I am able to get the output, but there is a 5-second delay between the camera input and the output. Is there any way to reduce this delay and increase the processing speed?
Thank You

1 Comment

Hariprasad Ravishankar on 18 Jul 2022
Hi Abhijith,
Does the time (the 5-second delay between camera input and output) include the frame grab from the camera? For high-resolution images, the frame grab itself could be the bottleneck. Is it possible to get the timing after the frame grab?
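To separate frame-grab time from processing time, a rough MATLAB sketch along these lines could be used; note that `grabFrame` and `createAroundView` are hypothetical placeholders for the actual camera-capture routine and the generated entry-point function:

```matlab
% Hypothetical timing sketch: grabFrame and createAroundView stand in
% for the real camera-capture and generated around-view functions.
tGrab = 0; tProc = 0;
N = 100;                          % number of frames to average over
for k = 1:N
    t0 = tic;
    frame = grabFrame();          % frame grab from the GMSL camera
    tGrab = tGrab + toc(t0);

    t1 = tic;
    out = createAroundView(frame); % generated CUDA code under test
    tProc = tProc + toc(t1);
end
fprintf('Avg grab: %.1f ms, avg processing: %.1f ms\n', ...
        1000*tGrab/N, 1000*tProc/N);
```

If the grab time dominates, optimizing the generated CUDA code alone will not remove the delay.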
Also consider turning on GPU Coder's Memory Manager through the GpuConfig.
cfg = coder.gpuConfig('dll');
cfg.GpuConfig.EnableMemoryManager = true;
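Building on this, a slightly fuller configuration sketch is below. The property names follow GPU Coder's `coder.gpuConfig` object; the compute capability value of 7.2 is an assumption based on the Xavier iGPU in the Drive AGX, so verify it against your board:

```matlab
cfg = coder.gpuConfig('dll');
cfg.GpuConfig.EnableMemoryManager = true; % pool GPU allocations, avoid repeated cudaMalloc/cudaFree
cfg.GpuConfig.MallocMode = 'unified';     % unified memory suits Tegra-based boards such as Drive AGX
cfg.GpuConfig.ComputeCapability = '7.2';  % assumed for the Xavier iGPU; check your hardware
```

Unified memory removes explicit host-to-device copies on Tegra-class devices where CPU and GPU share physical memory, which can noticeably reduce per-frame latency.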
See the link below for more details:
Hari


Answers (0)


Release: R2021b

Asked: 30 Jun 2022
