Effective way to estimate the big O runtime?

24 views (last 30 days)
Joe on 5 May 2013
In order to estimate the big O runtime of MATLAB's find function, I have simply created a plot of the runtimes of the find function for arrays of various lengths n. To estimate the big O runtime, I compare the slope of that plot to the slopes of various known big O runtimes. I have used this method for MATLAB's sort function and estimated its runtime to be O(n*log(n)). For the find function it is much harder to estimate the runtime with my method, so I was wondering if there is a more effective way to estimate it.
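For reference, a minimal sketch of this timing approach (my own illustration, assuming a logical input with a fixed 50% density of nonzeros): time find over a range of sizes and fit a line in log-log space; the slope approximates the exponent of a polynomial runtime.
  % Time FIND for several input sizes and estimate the growth
  % exponent from the slope of a linear fit in log-log space.
  sizes = round(logspace(5, 7, 8));     % array lengths from 1e5 to 1e7
  t     = zeros(size(sizes));
  for k = 1:numel(sizes)
      x = rand(1, sizes(k)) > 0.5;      % logical test vector, ~50% true
      tic
      idx = find(x);                    %#ok<NASGU>
      t(k) = toc;
  end
  p = polyfit(log(sizes), log(t), 1);   % slope ~ polynomial exponent
  fprintf('estimated exponent: %.2f (about 1.0 suggests O(n))\n', p(1));
  loglog(sizes, t, 'o-'); xlabel('n'); ylabel('time [s]');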

Accepted Answer

Jan on 6 May 2013
Edited: Jan on 6 May 2013
It is not trivial to determine the O complexity by measuring timings. On the one hand, e.g. a quadratic dependency on the input size can matter for sufficiently large data only, but "sufficiently large" can mean dozens, millions or zillions of elements. The performance of an algorithm can depend mainly quadratically on the input at moderate sizes, while it is, e.g., O(n^4) for large data.
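To illustrate why the measured slope depends on the measured range, consider a purely hypothetical cost model t(n) = 1e6*n + n^2 (an illustration, not a real algorithm): a log-log fit looks linear for moderate n, and the quadratic term emerges only for large n.
  % Hypothetical cost model (illustration only): t(n) = 1e6*n + n^2.
  % The fitted log-log slope depends on the range of n you measure.
  cost  = @(n) 1e6*n + n.^2;
  n_mod = logspace(2, 4, 20);           % moderate sizes: linear term dominates
  n_big = logspace(8, 10, 20);          % large sizes: quadratic term dominates
  p_mod = polyfit(log(n_mod), log(cost(n_mod)), 1);
  p_big = polyfit(log(n_big), log(cost(n_big)), 1);
  fprintf('slope for moderate n: %.2f\n', p_mod(1));   % ~1.0, looks like O(n)
  fprintf('slope for large n:    %.2f\n', p_big(1));   % ~2.0, O(n^2) emerges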
On the other hand, especially for zillions of elements, exhausting the RAM has an important influence due to disk swapping, and the performance no longer depends mainly on the complexity of the program.
Another problem is multi-threading: e.g. SUM distributes the work to different threads if the input vector has more than 88999 elements. This is massively slower on single-core machines and faster on dual cores. A similar problem appears for different cache sizes: when the data is split into chunks that fit into the processor cache, the execution can be much faster than accessing widely spread elements of an array, which must be read from the slower RAM.
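A sketch to probe such a threshold effect on your own machine (the 88999-element limit for SUM is implementation-dependent and may differ between MATLAB versions and machines):
  % Compare the per-element cost of SUM just below and just above
  % the multi-threading threshold mentioned above.
  n1 = 88000;  n2 = 90000;
  x1 = rand(1, n1);  x2 = rand(1, n2);
  reps = 1000;                          % repeat to get measurable times
  tic; for k = 1:reps, s = sum(x1); end; t1 = toc / reps;
  tic; for k = 1:reps, s = sum(x2); end; t2 = toc / reps;
  fprintf('per element: %.3g ns vs %.3g ns\n', 1e9*t1/n1, 1e9*t2/n2);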
If a compiled function uses the SSE units, it can matter whether the data are aligned to a certain boundary in RAM, e.g. whether they are stored at an even multiple of 128 bits. Then an algorithm might perform much slower in 50% of the cases when the memory is allocated with a boundary of only 64 bits.
And finally, the runtime can also depend on the values: for FIND it could matter whether none or all of the indices are returned.
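For example, a quick check of this value dependence of FIND (exact timings are machine-dependent):
  % FIND must build the full index vector, so an all-true input
  % costs more than an all-false input of the same size.
  n = 1e7;
  xAll  = true(1, n);
  xNone = false(1, n);
  tic; idxAll  = find(xAll);  tAll  = toc;
  tic; idxNone = find(xNone); tNone = toc;
  fprintf('all matches: %.4f s, no matches: %.4f s\n', tAll, tNone);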

More Answers (1)

Youssef Khmou on 6 May 2013
Edited: Youssef Khmou on 6 May 2013
Hi,
The computational complexity can be derived with pencil and paper by following some rules, see for example: https://sites.google.com/site/youssefkhmou/flops
But for the "real-time" complexity you have to use the functions tic and toc to measure the elapsed time during code execution. Run the measurement several times and take the average, because every time you run the code other processes may be running and influence the elapsed time. To see, for example, the O(n*log(n)) behavior, use tic and toc for every case (matrix 4x4, 16x16, 500x500, ...) and plot the saved elapsed times as a function of the size; from that you can deduce the O.
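A minimal sketch of this averaging procedure, assuming square random matrices and sort as the operation under test:
  % Average several tic/toc measurements per size to smooth out
  % noise from other running processes, then plot time vs. size.
  sizes = [4 16 64 256 500 1000];       % n for n-by-n input matrices
  reps  = 20;
  t     = zeros(size(sizes));
  for i = 1:numel(sizes)
      A = rand(sizes(i));
      tic
      for r = 1:reps
          B = sort(A(:));               %#ok<NASGU>
      end
      t(i) = toc / reps;                % average elapsed time per run
  end
  plot(sizes, t, 'o-'); xlabel('matrix size n'); ylabel('average time [s]');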
Note: there are other, more sophisticated ways to estimate the O.
