Reading a large Excel file using the `read` command takes ~5 minutes, is this expected performance?
I am reading simulation/test output data from a .xlsx file into a MATLAB table through a datastore variable. The test data contains 450+ variables, each with 20,000+ samples, i.e. 450+ columns and 20,000+ rows, all numeric. I created a datastore on the Excel file, modified the selected variables and the variable type properties, and used the `read` command to read the file into a MATLAB table; it took about 5 minutes. When I tried the `readtable` command on the Excel file directly, it took about the same time. However, when I read the file interactively using the MATLAB import dialog, it took less than 30 seconds, so I am wondering if there's any way to achieve the same level of efficiency programmatically?
Accepted Answer
J. Alex Lee
on 6 Sep 2020
Try manually creating the import options with spreadsheetImportOptions().
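A minimal sketch of that approach. The filename, sheet, data range, and variable names below are placeholders, not from the original question; adjust them to match your spreadsheet. Since every column is numeric, all types can be declared up front so `readtable` skips the expensive auto-detection pass:

```matlab
% Fully specify the import so nothing has to be auto-detected.
% Assumptions (replace with your actual values): file "testdata.xlsx",
% sheet 1, a single header row, data starting in cell A2.
numVars = 450;

opts = spreadsheetImportOptions("NumVariables", numVars);
opts.Sheet = 1;
opts.DataRange = "A2";                              % data begins below the header row
opts.VariableNames = "Var" + (1:numVars);           % placeholder names; use your own
opts.VariableTypes = repmat("double", 1, numVars);  % all columns are numeric

T = readtable("testdata.xlsx", opts);
```

With a fully populated options object, `readtable` does not have to scan the file to infer ranges and types, which is typically where the extra minutes go.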
J. Alex Lee
on 7 Sep 2020
Yes, the idea is to fully specify the import parameters so that they don't have to be auto-detected.