Performance issue while fetching data from a C MEX function.
Hi,
We have a C MEX function that fetches data from an external source over the network and copies it into the plhs output array (allocated with mxCreateDoubleMatrix). When the data is large (around 22 MB), accessing it in MATLAB is very slow and takes a while (about 20 seconds). If we split the data into chunks (around 100 KB each), it is copied and becomes accessible in the MATLAB variable much faster. Is there a memory limit when allocating data, copying it, and accessing it as a MATLAB variable? We appreciate your suggestions.
Thank you
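For reference, here is a minimal sketch of the pattern described above, assuming the network data has already been received into a local buffer (the buffer name, size, and fill step are hypothetical); it allocates the output once and copies it with a single memcpy rather than element by element:
#include <string.h>
#include "mex.h"

void mexFunction(int nlhs, mxArray *plhs[], int nrhs, const mxArray *prhs[])
{
    mwSize rows = 1;
    mwSize cols = 22 * 1024 * 1024 / sizeof(double);   /* roughly 22 MB of doubles */

    /* Hypothetical buffer standing in for data received over the network. */
    double *netBuffer = (double *)mxMalloc(rows * cols * sizeof(double));
    /* ... fill netBuffer from the network here ... */

    /* Allocate the MATLAB output array. */
    plhs[0] = mxCreateDoubleMatrix(rows, cols, mxREAL);

    /* One bulk copy into the output; avoids per-element copy loops. */
    memcpy(mxGetPr(plhs[0]), netBuffer, rows * cols * sizeof(double));

    mxFree(netBuffer);
}
If the actual code copies element by element or reallocates/grows the output inside a loop, that alone can account for the slowdown, independent of any memory limit.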
2 Comments
Jan
8 Jun 2022
Without seeing the code it is impossible to guess whether it contains an avoidable bottleneck. 22 MB is usually not considered "huge". "100Kbs" is a strange unit, and it is not clear how it could define a chunk.
The memory limit is defined in MATLAB's Preferences => Workspace => Array size limit.
Answers (0)