This question is closed. Reopen it to edit or answer.

What to do when you really ARE out of memory?

2 views (last 30 days)
Robert Jenkins
Robert Jenkins on 4 Aug 2015
Closed: MATLAB Answer Bot on 20 Aug 2021
What is the solution for optimizing code when you really are just working with too large a dataset?
Currently I need to run TriScatteredInterp on three vectors, each 100,000,000 x 1.
scatteredInterpolant does not work any better in this instance.

Answers (3)

the cyclist
the cyclist on 4 Aug 2015
Edited: the cyclist on 4 Aug 2015
For very large datasets, processing a random sample of the data will often give satisfactory results.
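A minimal sketch of that idea, assuming the three 100,000,000 x 1 vectors are named x, y, and v (hypothetical names) and the query points are xq, yq:

```matlab
% Subsample the scattered data before building the interpolant.
% x, y, v and xq, yq are assumed variable names, not from the original post.
n   = numel(x);
k   = 1e6;                        % sample size; tune to available memory
idx = randperm(n, k);             % uniform random sample without replacement
F   = scatteredInterpolant(x(idx), y(idx), v(idx));
vq  = F(xq, yq);                  % evaluate at the query points
```

For smooth underlying surfaces, a sample of around a million points is often indistinguishable from the full fit; you can check by comparing results at two different sample sizes.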

Walter Roberson
Walter Roberson on 4 Aug 2015
Store the data in hierarchies such as octrees, which let you extract a subset that fits in working memory for the fine-grained work.
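One way to sketch this for 2-D scattered data is a coarse grid of buckets (a quadtree-style index flattened to one level): only points whose bucket overlaps the region of interest are pulled into memory. Variable names x, y, v and the bucket range are assumptions, not from the original thread:

```matlab
% Coarse spatial bucketing: index each point into an nb-by-nb grid,
% then fit only the points whose bucket overlaps the region of interest.
nb = 256;                                    % buckets per axis
xe = linspace(min(x), max(x), nb + 1);
ye = linspace(min(y), max(y), nb + 1);
[~, ~, bx] = histcounts(x, xe);              % bucket index along x
[~, ~, by] = histcounts(y, ye);              % bucket index along y
roi = bx >= 10 & bx <= 20 & by >= 10 & by <= 20;   % example ROI buckets
F  = scatteredInterpolant(x(roi), y(roi), v(roi));
```

A true octree/quadtree would recurse so that dense regions get finer cells, but even this single-level index keeps each working set small and bounded.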

Robert Jenkins
Robert Jenkins on 7 Aug 2015
The solution I used in the end was to break each of my vectors into manageable chunks and run TriScatteredInterp on each of them.
Each block's interpolant is evaluated over the full query grid but only yields solutions inside the block it was built from; I then simply stitched these results back together.
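The block-wise approach above can be sketched as follows, again assuming the data vectors are x, y, v and the query grid is xq, yq (names not from the original post). This splits along x only; a real run may need 2-D blocks and a small overlap between blocks to avoid gaps at the seams:

```matlab
% Break the data into blocks along x, interpolate each block separately,
% and stitch the per-block results into one output vector.
edges = linspace(min(x), max(x), 11);          % 10 blocks along x
vq = nan(size(xq));                            % stitched result
for b = 1:numel(edges) - 1
    in = x  >= edges(b) & x  <= edges(b+1);    % source points in this block
    q  = xq >= edges(b) & xq <  edges(b+1);    % query points in this block
    if nnz(in) >= 3 && any(q)                  % need >= 3 points to triangulate
        F = TriScatteredInterp(x(in), y(in), v(in));
        vq(q) = F(xq(q), yq(q));
    end
end
```

Query points near a block boundary are extrapolating from one side only, which is why a small overlap between adjacent blocks usually gives cleaner seams.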
Thank you both for your input!

