Reading a large Excel file using the "read" command takes ~5 minutes, is this expected performance?

I am reading simulation/test output data from a .xlsx file into a MATLAB table through a datastore variable. The test data contains 450+ variables, each with 20,000+ samples (i.e., 450+ columns and 20,000+ rows), and all values are numeric. I created a datastore on the Excel file, modified its selected-variable and variable-type properties, and used the read command to read the file into a MATLAB table; it took about 5 minutes. Trying the readtable command on the Excel file directly took about the same time. However, importing the file interactively through the MATLAB import dialog took less than 30 seconds, so I am wondering whether there is a way to achieve the same efficiency programmatically?
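For reference, this is roughly what I am doing ('testOutput.xlsx' stands in for the real file name):

ds = datastore('testOutput.xlsx');                % creates a SpreadsheetDatastore
ds.SelectedVariableNames = ds.VariableNames;      % keep all 450+ variables
ds.SelectedVariableTypes = repmat({'double'}, 1, numel(ds.SelectedVariableNames));
T = read(ds);                                     % this step takes ~5 minutes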

Accepted Answer

J. Alex Lee on 6 Sep 2020
Try manually creating the import options with spreadsheetImportOptions().
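Something along these lines (I am assuming all 450+ columns are numeric, the header is in row 1, and the data starts in row 2; adjust the count, ranges, and file name to your sheet):

nVars = 450;                                      % number of data columns (adjust)
opts = spreadsheetImportOptions('NumVariables', nVars);
opts.VariableNamesRange = 'A1';                   % header row
opts.DataRange = 'A2';                            % first data cell
opts.VariableTypes = repmat({'double'}, 1, nVars);% declare every column numeric
T = readtable('testOutput.xlsx', opts);

Because every parameter is specified up front, readtable can skip the auto-detection pass that makes the datastore and plain readtable calls slow.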
2 Comments
Ajay Kumar on 7 Sep 2020
Thanks. I will read up on this function and try it out. The format of the test output sheet will not change that often, but if it does, will I have to update the options object I create?
J. Alex Lee on 7 Sep 2020
Yes, the idea is to fully specify the import parameters so that they don't have to be auto-detected.
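One way to handle an occasional layout change is to pay the detection cost a single time with detectImportOptions (that call is about as slow as an auto-detected read), then save the resulting options object and reuse it on every subsequent read; the file names here are placeholders:

opts = detectImportOptions('testOutput.xlsx');    % slow, one-time detection
save('importOpts.mat', 'opts');                   % persist for later sessions

S = load('importOpts.mat');                       % reuse without re-detecting
T = readtable('testOutput.xlsx', S.opts);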

Release: R2017b
