Loading very large CSV files (~20GB)
34 views (last 30 days)
Sadjad Fakouri Baygi on 17 Mar 2016
Commented: Sadjad Fakouri Baygi on 18 Mar 2016
I have some CSV files that I need to import into MATLAB (preferably in .mat format). I tried to break these files into 100 pieces using the csvread function, but it is very slow and doesn't get past step 60, even though my machine is fairly new. I only need to extract the numeric values, which are comma-separated. I would appreciate any help with this.
Thanks,
Sajad
2 Comments
per isakson on 17 Mar 2016
- It might help to split the file into smaller files before importing into MATLAB. If on Windows, GSplit is a free, reliable file splitter that lets you split your large files.
- How much RAM is in your system?
- What OS?
- Upload a small piece of the file to your question.
Accepted Answer
Robert
17 Mar 2016
You should look into datastore and mapreduce. They were introduced in R2014b and are intended for handling large data sets. The datastore object lets you read the data in chunks, skip columns, and store the results in a table. Its behavior is somewhat similar to fread or fscanf with a size input; however, the datastore's use of a table lets you assign a different data type to each column. I use this for data that mixes Boolean status bits with doubles, so that I don't have to store the Booleans as doubles in my data array.
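A minimal sketch of that workflow (the file name, column names, and formats below are placeholders, not from the original post):

```matlab
% Point a datastore at the large CSV; only the selected columns are read.
ds = datastore('hugefile.csv');                           % hypothetical file name
ds.SelectedVariableNames = {'time', 'value', 'statusBit'}; % hypothetical columns
% Assign a type per column: doubles for the numeric data,
% uint8 for the Boolean status bits so they aren't stored as doubles.
ds.SelectedFormats = {'%f', '%f', '%u8'};
ds.ReadSize = 100000;   % rows per chunk; tune this to your available RAM

while hasdata(ds)
    t = read(ds);       % returns a table holding the next chunk
    % ... process t here (e.g. accumulate statistics or append to a matfile) ...
end
```

Processing each chunk inside the loop, rather than concatenating all chunks into one array, is what keeps the memory footprint bounded.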
mapreduce is specifically designed for operating on data sets that don't fit in memory. Rather than attempt to explain it I will simply suggest you check out the documentation.
docsearch Getting started with mapreduce
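As a hedged sketch of the mapreduce pattern, here is one way to compute the overall mean of a single column without ever holding the whole file in memory; the file name 'hugefile.csv' and the variable name 'value' are assumptions for illustration:

```matlab
% meanMapper.m -- called once per chunk read from the datastore.
function meanMapper(data, info, intermKVStore)
    % Emit this chunk's partial sum and row count under a shared key.
    add(intermKVStore, 'partial', [sum(data.value), height(data)]);
end

% meanReducer.m -- combines all partial results for the key 'partial'.
function meanReducer(key, intermValIter, outKVStore)
    total = 0; n = 0;
    while hasnext(intermValIter)
        part = getnext(intermValIter);
        total = total + part(1);
        n = n + part(2);
    end
    add(outKVStore, 'mean', total / n);
end

% Driver script:
% ds = datastore('hugefile.csv', 'SelectedVariableNames', 'value');
% outds = mapreduce(ds, @meanMapper, @meanReducer);
% readall(outds)   % table with Key 'mean' and the overall mean as Value
```

The mapper only ever sees one chunk at a time, and the reducer only sees the small per-chunk summaries, so this scales to files far larger than RAM.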
More Answers (0)