I have deployed a program that controls and reads out a DAQ device on a desktop computer. The program is supposed to run for a long time (many days), and I need to record the data at a sampling rate of 20 kHz. Therefore I analyze the signal every few seconds and store only the relevant information on the hard disk.
Unfortunately I keep hitting the memory limit of the desktop computer. Now I am looking for the lines in the code where memory usage steadily grows. For this I use:
profile -memory on
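For context, I enable the memory profiling roughly like this (`runMeasurement` is a placeholder for my actual top-level function):

```matlab
profile -memory on   % also record allocated/freed/peak memory per line
runMeasurement();    % placeholder for the long-running acquisition program
profile off
profview             % open the report, which shows the memory columns
```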
I enabled this in a version of my program. When evaluating the report I came across some lines that I do not understand. Here is a selection:
- I have a control loop running. This loop is called very often, and the profiler reports 242 MB / 266 MB / 7.65 MB for the pause() call, which I read as allocated / freed / peak memory. Do I understand correctly that the function allocates 242 MB and frees 266 MB over the course of the run? How can it free more than it allocates?
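A stripped-down version of that loop looks roughly like this (all identifiers are placeholders; only the pause() call matches the profiler line):

```matlab
while keepRunning
    updateGui();   % placeholder: refresh displays, check user input
    pause(0.25);   % this is the pause() line the profiler flags
end
```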
- During the measurement I read the data from the hardware buffer at 4 Hz and write it to a .bin file. As soon as a data set is complete, after 100 s, I read it back in and analyze the signal. For this I load a dataset data(2000000,6) of type single (about 46 MB). After the analysis only the variable out(80,2*n) remains, where n is the number of analysis cycles completed so far. I then clear the large data variable, delete the .bin file, and start writing again. So at peak I would expect memory use on the order of one data(2000000,6) array. However, the report says:
4.509 108 2.6g/15.6m/39.3m 82 fileID = fopen(par.tempfile);
This line is called every 100 s, 108 times in this case. But 2.6 GB were allocated for it. That seems far too much for a simple fopen. How is this possible?
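For completeness, the 100 s cycle is sketched below; apart from par.tempfile and the fopen call from line 82 of the report, all identifiers are placeholders:

```matlab
n = 0;
while keepRunning
    % ... samples are appended to par.tempfile at 4 Hz elsewhere ...
    fileID = fopen(par.tempfile);                          % line 82 in the report
    data = fread(fileID, [2000000, 6], 'single=>single');  % ~46 MB as single
    fclose(fileID);
    n = n + 1;
    out(:, 2*n-1:2*n) = analyze(data);                     % keep only 80-by-2 per cycle
    clear data                                             % free the large array
    delete(par.tempfile);                                  % start a fresh file
end
```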