How to use GPU for deep learning
I’m training a YOLO v4 detection model in MATLAB. I just got a computer with a graphics card, an NVIDIA GeForce RTX 3070 Ti, and I want to get the maximum out of it. What do I need to write in my MATLAB code to perform training on the GPU?
Answers (1)
KSSV
on 1 Oct 2022
Example:
options = trainingOptions('sgdm', ...
    'Momentum',0.9, ...
    'InitialLearnRate',initLearningRate, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',learningDropPeriod, ...
    'LearnRateDropFactor',learningRateFactor, ...
    'L2Regularization',l2reg, ...
    'MaxEpochs',maxEpochs, ...
    'MiniBatchSize',miniBatchSize, ...
    'Shuffle','every-epoch', ...
    'ValidationData',{inputVal, targetVal}, ...
    'ValidationFrequency',50, ...
    'ValidationPatience',10, ...
    'GradientThresholdMethod','l2norm', ...
    'GradientThreshold',0.01, ...
    'Plots','training-progress', ...
    'ExecutionEnvironment','auto', ... %<------ check this. Keep it 'auto' so MATLAB picks the GPU when one is available
    'Verbose',true);
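For completeness, here is a minimal sketch of how these options might be used for GPU training of the detector, assuming the Parallel Computing Toolbox and Computer Vision Toolbox are installed. The names trainingData and detector are placeholders (a datastore of labeled training images and a pretrained yolov4ObjectDetector), not variables defined in the post.
gpuDevice   % confirm MATLAB detects the RTX 3070 Ti (requires Parallel Computing Toolbox)
% To force GPU training rather than letting MATLAB choose, pass
% 'ExecutionEnvironment','gpu' in the trainingOptions call above instead of 'auto'.
% trainingData and detector are assumed placeholders for your labeled training
% datastore and a pretrained yolov4ObjectDetector.
[detector, info] = trainYOLOv4ObjectDetector(trainingData, detector, options);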