
working with 1 TB or greater data files

6 views (last 30 days)
James Buxton on 7 Nov 2014
Commented: James Buxton on 5 Dec 2014
I would like to know the best way to work with very large data files, ranging from 1 TB to 30 TB. The files contain in-phase and quadrature (IQ) data captured from a real-time signal analyzer. I am running MATLAB R2014b on a Windows 8 64-bit computer with 64 GB of RAM.
I would like to be able to read in the data files to conduct basic signal processing and analysis such as FFTs, as well as advanced routines more specific to RF analysis such as error vector magnitude, adjacent channel power ratio, etc.
I am not familiar with MATLAB's parallel computing capabilities or other 'big data' capabilities such as mapreduce, memmapfile, or datastore.
Any information, feedback, or suggestions as to recommended practices would be most welcome.
thanks, JimB
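Of the tools mentioned in the question, memmapfile is often the first thing to try for a file larger than RAM, since it lets you index into the file as if it were an in-memory array. A minimal sketch, assuming the capture is interleaved 16-bit IQ samples (I,Q,I,Q,...) — the real layout and file name depend on the analyzer and must be checked against its documentation:

```matlab
% Map a large binary IQ capture without loading it all into memory.
% 'capture.bin' and the int16 interleaved format are assumptions.
m = memmapfile('capture.bin', 'Format', 'int16');

nPairs = numel(m.Data) / 2;                  % complex samples in the file
blk    = m.Data(1 : 2*1e6);                  % pull the first 1e6 IQ pairs
iq     = double(blk(1:2:end)) + 1i*double(blk(2:2:end));
S      = fftshift(fft(iq));                  % spectrum of that block
```

Only the indexed portion of `m.Data` is paged in, so the same pattern extends to stepping a window across the full 30 TB file.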
3 comments
yashwanth annapureddy on 19 Nov 2014
Yes, it would be good to know what type of files you are dealing with. datastore and mapreduce work with tabular text files and MAT-files of a specific format.
Please refer to the documentation for datastore and mapreduce, and let us know if you have any questions about using them.
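For the tabular-text case that datastore does support, the chunked-read pattern looks roughly like this; the file name and ReadSize are placeholders, not anything from the original thread:

```matlab
% Hedged sketch: read a large delimited text file in chunks with datastore.
ds = datastore('large_table.csv');   % assumes a tabular text file
ds.ReadSize = 100000;                % rows returned per read

while hasdata(ds)
    t = read(ds);                    % a table holding the next chunk
    % ... process the rows in t, accumulate statistics, etc. ...
end
```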
James Buxton on 5 Dec 2014
The file is a binary file. I can easily read data from a small file using 'fread'.
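Since fread already works for the small files, one option that scales to the large ones is to keep fread but process the file block by block, so only one block is ever in memory. A sketch under the same assumptions as above (interleaved int16 IQ; the block size is arbitrary):

```matlab
% Block-wise processing of a large binary IQ file with fread.
% The int16 interleaved sample format and block length are assumptions.
fid = fopen('capture.bin', 'r');
blockLen = 1e6;                              % IQ pairs per read

while true
    raw = fread(fid, 2*blockLen, 'int16=>double');
    if isempty(raw), break; end              % end of file
    iq = raw(1:2:end) + 1i*raw(2:2:end);     % rebuild complex samples
    P  = mean(abs(fft(iq)).^2);              % e.g. average power via FFT
    % ... accumulate spectra, EVM, ACPR metrics here ...
end
fclose(fid);
```

Because each pass touches a fixed-size block, the loop runs in constant memory regardless of whether the file is 1 TB or 30 TB.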


Answers (1)

Darek on 14 Nov 2014
Don't use Matlab. It's a waste of your time. Use AWS Kinesis with Redshift.


