Large data file I/O
I am trying to speed up a bottleneck in our code. Currently, one of our FORTRAN modules writes its output data to a text file, which is saved with a .m extension. That .m file is then loaded into MATLAB. An example would be
function x = my_data(x)
x.time = [LARGE AMOUNT OF DATA]
This read operation seems to cause MATLAB to run out of memory. The MATLAB help suggests storing large amounts of data in MAT-files because they are optimized for read/write operations and also compress the data, and says this is better than using low-level file I/O such as fopen. But since our data is being read from a .m file, isn't MATLAB still effectively doing that kind of file I/O? My question is: should we take the time to write the data from FORTRAN as a MAT-file instead of a .m file?
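For comparison, here is a minimal sketch of the low-level alternative: if the FORTRAN side were changed to write the time vector as a flat stream of 8-byte doubles (stream access, no record markers) to a hypothetical file my_data.bin, MATLAB could read it directly with fread instead of parsing a .m script:

```matlab
% Sketch only: assumes my_data.bin holds raw little-endian doubles
% written by the FORTRAN module with stream (unformatted) access.
fid = fopen('my_data.bin', 'r', 'ieee-le');  % open for binary read
x.time = fread(fid, Inf, 'double');          % read all doubles as a column vector
fclose(fid);
```

This avoids the .m-file parser entirely, since a script full of numeric literals must be tokenized and evaluated, whereas fread copies bytes straight into an array. A MAT-file written from FORTRAN would add compression and self-describing variable names on top of this.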