Huge Data File Import/Export

Saurav Agarwal on 6 Aug 2013
I have a huge .txt data file. Each element is separated by a semicolon (;) and each row by a newline character. The data was generated by a C++ program, and the number of columns is not the same in every row. I need to read each row in MATLAB. I converted the file to .xlsx and used the following code:
A = xlsread('file.xlsx', 'Sheet1', '2:2');
This command takes a long time to execute. The data is approximately 30,000 × 500.
1. How do I read the data faster? I tried csvread but could not get it to work. (See the sketch after this list.)
2. How do I call xlsread in a loop, running from '2:2' through '30000:30000'? Importing all the data at once causes an out-of-memory failure.
3. In what format should I generate the data from C++ so that it is fast to import?
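
For reference, a minimal sketch addressing question 1: read the ragged, semicolon-delimited text file directly, one row at a time, instead of going through Excel. The filename file.txt is an assumption; the thread never names the original file.

fid = fopen('file.txt', 'r');          % the original C++ text export (assumed name)
rows = {};                             % one cell per row; row lengths may differ
line = fgetl(fid);
while ischar(line)                     % fgetl returns -1 (numeric) at end of file
    rows{end+1, 1} = sscanf(line, '%f;').';   % parse numbers separated by ';'
    line = fgetl(fid);
end
fclose(fid);

Because each pass of the loop parses a single row, the same loop also answers question 2: process each row inside the loop rather than collecting them all into rows, and the whole file never has to fit in memory at once.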
  1 Comment
Cedric on 6 Aug 2013
Edited: Cedric on 6 Aug 2013
A 3e4-by-3e4 matrix stored as double takes 7.2 GB of memory.
Files this huge should be stored/managed in a binary format (your own, HDF5, netCDF, etc.) if you want to be efficient.
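
Along the lines of Cedric's suggestion, a minimal sketch of reading a self-describing binary file in MATLAB. The layout and the filename data.bin are hypothetical, not from the thread: it assumes the C++ program writes each row as a uint32 element count followed by that many doubles.

fid = fopen('data.bin', 'r');          % binary file written by the C++ program (assumed layout)
rows = {};
n = fread(fid, 1, 'uint32');           % element count for the next row
while ~isempty(n)                      % fread returns [] at end of file
    rows{end+1, 1} = fread(fid, n, 'double').';
    n = fread(fid, 1, 'uint32');
end
fclose(fid);

A binary read skips text parsing entirely, which is where most of the import time goes; a standard container such as HDF5 (read with h5read in MATLAB) gives the same benefit in a portable format.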


Answers (0)
