GPU Memory for dlconv

David Eriksson on 5 Mar 2024
Answered: R on 14 Mar 2024
Hi, why does MATLAB need 1 GB of GPU memory when training a network in which dlconv takes a 0.01 GB input and produces a 0.128 GB activation output? The number of learnable parameters is 16×5×5×3. I got this result by debugging the code with a breakpoint at the dlconv line: before executing dlconv I noted the GPU memory, then I stepped over (executed) the dlconv call and noted the GPU memory again. The difference is 1 GB. Is there a way to calculate how much GPU memory MATLAB will use during training? Best, David
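Roughly, the measurement I did looks like this (a minimal sketch with illustrative sizes, not my exact data; it assumes Deep Learning Toolbox and Parallel Computing Toolbox):

X = dlarray(rand(512,512,3,4,'single','gpuArray'),'SSCB');  % ~0.01 GB input batch
W = dlarray(rand(5,5,3,16,'single','gpuArray'));            % 16*5*5*3 learnables (filterSize-by-channels-by-filters)
g = gpuDevice;                        % current GPU (no reset); query memory now
memBefore = g.AvailableMemory;
Y = dlconv(X,W,0,'Padding','same');   % the dlconv call I set the breakpoint on
wait(g);                              % make sure the GPU has finished the kernel
g = gpuDevice;                        % re-query so AvailableMemory is up to date
memAfter = g.AvailableMemory;
fprintf('dlconv used about %.2f GB of GPU memory\n',(memBefore-memAfter)/1e9);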

Answers (1)

R on 14 Mar 2024
Hi David,
MATLAB's GPU memory usage during training depends on several factors: the size of the input data, the batch size, the number of learnable parameters, the memory needed for the forward and backward passes, and the specific operations the network performs.
Every function also differs in how much working memory it needs to run. There really isn't a way to estimate the memory requirement other than running the function and monitoring GPU memory in a separate process.
There is a MATLAB Answer that illustrates a crude estimate of the GPU memory required to train a deep learning model; following that example might help you estimate the memory for "dlconv". A rough version of that calculation is sketched below.
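For a very rough back-of-the-envelope figure, you can add up the parameters, their gradients, the optimizer state, and the cached activations. This is only a lower bound under stated assumptions (single precision, an Adam-style optimizer keeping two extra states per learnable, activations cached for the backward pass); it deliberately excludes the cuDNN/library workspace and CUDA context overhead, which is usually what makes the measured number so much larger:

bytesPerElement = 4;                  % single precision
numLearnables   = 16*5*5*3;           % conv weights from your question
inputBytes      = 0.01e9;             % ~0.01 GB input (your number)
outputBytes     = 0.128e9;            % ~0.128 GB dlconv output (your number)
paramBytes = numLearnables*bytesPerElement;   % weights
gradBytes  = paramBytes;                      % gradients, same size as weights
optimBytes = 2*paramBytes;                    % assumed Adam moments (2 per weight)
activBytes = inputBytes + outputBytes;        % activations cached for backward pass
estimateGB = (paramBytes + gradBytes + optimBytes + activBytes)/1e9;
fprintf('Crude lower-bound estimate: %.3f GB (excludes library workspace)\n',estimateGB);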
