How to avoid memory problems while processing a huge table?

Nitinkumar Ambekar on 31 Aug 2016
I have a huge observation table with around 3 million (30 lakh) rows and 12 columns. While training a kNN classifier in R2016a, I am getting memory-related errors. Is there any way to avoid this? I have tried reducing the number of rows, but that degrades the output quality.
Each row of the table is a pixel, with its feature values in the remaining columns. One set of MRI scans contains around 20 images of 512x512, and I load one set to build the observation table. Is there another way to pass a large amount of data to the kNN classifier?
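
For scale, a rough back-of-the-envelope check (a sketch, assuming the table stores the default `double` class) shows why memory runs out quickly:

    % Approximate footprint of the observation table in memory
    rows  = 3e6;                 % ~30 lakh observations
    cols  = 12;                  % features per pixel
    bytes = rows * cols * 8      % ~288 MB per copy at 8 bytes per double
    % Intermediate copies made during training can multiply this figure;
    % casting the features to single would halve the per-copy footprint.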

Answers (1)

KSSV on 31 Aug 2016
See `doc datastore`, `doc memmapfile`, and `doc mapreduce`.
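
A minimal sketch of the datastore route, assuming the observation table has been exported to a CSV file (`observations.csv` is a hypothetical name); each `read` pulls only one chunk into memory:

    ds = datastore('observations.csv');   % tabular datastore over the file
    ds.ReadSize = 100000;                 % rows per chunk; tune to your RAM

    while hasdata(ds)
        chunk = read(ds);                 % one chunk, returned as a table
        % ... process the chunk: accumulate stats, subsample, etc. ...
    end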
  1 Comment
Nitinkumar Ambekar on 1 Sep 2016
Thanks @Dr. Siva. One small query: can I pass one of these to a function that takes a `table` or a `matrix`?
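
For what it's worth, a sketch under the same CSV-datastore assumption as above: the datastore object itself is not a `table`, but its read functions return one, which can be passed on directly or converted:

    chunk = read(ds);             % one chunk as a table
    X     = table2array(chunk);   % numeric matrix, if the function needs one
    T     = readall(ds);          % whole file as one table (only if it fits in RAM)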

