How to speed up large data calculations
Hi, I'm working with a large array, data, of up to N = 10,000,000 points, along with its time index, t. From it I want to build a new array, Intp, of length 2*N by interpolating the original data. For each output point I use a Gaussian kernel to select a small sub-array (only 100 to 200 points) and compute a weighted sum, as the code below shows. Because the data set is so large, the calculation can take days. At first glance it looks like a convolution, but it is not, and the data is too large for a meshgrid-style approach. Right now I use a for loop, and on my Windows 7 machine with MATLAB R2013a and 8 GB of memory it takes about 2.5 hours for just 1 million points. Can the loop be replaced by vector or matrix operations, or sped up with the Parallel Computing Toolbox (PCT)? My code is listed below (here I use a sinusoidal test signal in place of my raw data). The program needs too much computation time, so any ideas for speeding it up would be much appreciated. Thanks a lot for your help.
clear all
N = 1e5;
f = 500;
t = linspace(0,1,N);              % time index
NA = 5e+05;                       % (unused in this snippet)
data = sin(2*pi*f*t);             % test signal standing in for the raw data
k = 1:2*N;
Comp = 0*k;                       % preallocate
Intp = 0*k;                       % preallocate output
for k = 1:2*N
    Comp = abs(t - (k-1)/(2*N));                      % distance from the query time
    idx  = find(0.0025 - abs(t - (k-1)/(2*N)) >= 0);  % samples inside the window
    Intp(k) = sum(data(idx).*exp(-0.00005*Comp(idx).^2));
end
%figure(1)
%plot(Intp)
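One direction that may help (a sketch under the stated assumptions, not a tested drop-in replacement): because t = linspace(0,1,N) is uniformly spaced and the window half-width of 0.0025 is fixed, the samples inside each window form a contiguous block whose first and last indices can be computed directly. That avoids scanning the whole N-point array with find on every one of the 2*N iterations. The constants are taken from the code above; window edges may shift by one sample relative to the original because of floating-point rounding.

% Sketch: direct index arithmetic instead of a full-array find per iteration.
% Assumes N, t, and data are defined as in the code above.
halfwin = 0.0025;                 % window half-width from the original code
dt      = t(2) - t(1);            % uniform sample spacing, 1/(N-1)
Intp    = zeros(1, 2*N);          % preallocate the output
for k = 1:2*N
    tq = (k-1)/(2*N);                             % query time of output sample k
    lo = max(1, ceil((tq - halfwin)/dt) + 1);     % first sample with t(lo) >= tq - halfwin
    hi = min(N, floor((tq + halfwin)/dt) + 1);    % last sample with t(hi) <= tq + halfwin
    d  = t(lo:hi) - tq;                           % distances inside the window
    Intp(k) = sum(data(lo:hi) .* exp(-0.00005*d.^2));
end

Each iteration then touches only the 100 to 200 samples in the window rather than all N points, so the per-iteration cost scales with the window size instead of with N.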
Answers (1)
Sagar Zade
12 Dec 2019
Hi, have you explored the following link: https://www.mathworks.com/help/matlab/large-files-and-big-data.html?s_tid=CRUX_lftnav
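That page covers MATLAB's tools for data that does not fit in memory, such as datastores and tall arrays. Since the question also mentions the Parallel Computing Toolbox, here is a minimal parfor sketch of the same loop; it assumes that toolbox is installed and that N, t, and data are defined as in the question, and is an illustration rather than a tested solution.

% Parallel version of the question's loop (requires Parallel Computing Toolbox).
% Assumes N, t, and data are already defined as in the question.
Intp = zeros(1, 2*N);                 % sliced output variable
parfor k = 1:2*N
    Comp = abs(t - (k-1)/(2*N));      % distance from the query time
    idx  = (Comp <= 0.0025);          % logical mask for samples inside the window
    Intp(k) = sum(data(idx) .* exp(-0.00005*Comp(idx).^2));
end

The iterations are independent of one another, so parfor can distribute them across workers; the work done per iteration is unchanged.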