How to force enable GPU usage in fitrgp

When I use the Regression Learner app and select the 'Use Parallel' option for training, I can see my NVIDIA GPU (compute capability 7.2) being used.
But when I generate a function from the app and run it from a script, it won't use the GPU. Can I set something in the script to make it use the GPU?
I tried gpuArray and tall arrays, and neither is supported by fitrgp.
regressionGP = fitrgp(X, Y, ...
    'BasisFunction', 'constant', ...
    'KernelFunction', 'exponential', ...
    'Standardize', true, ...
    'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', struct( ...
        'Verbose', 1, ...
        'UseParallel', true));

3 Comments

Please correct your spelling and capitalization errors first. Too many typos will greatly discourage others from answering your question.
M N
M N on 8 Apr 2023
Edited: M N on 8 Apr 2023
Is there something specific you are not able to understand?
In MATLAB Answers, each user can communicate in whatever language they feel most comfortable communicating in. If a reader has difficulty understanding, then the reader can ask for clarification of particular parts... or the reader can move on to other questions.
There is no requirement that people post in English -- and if they do post in English then it is fine if they used a machine translation that might get words or capitalization or contractions wrong compared to "perfect" English. We are here for MathWorks products, not for complaining about typographic mistakes.


 Accepted Answer

Ive J
Ive J on 7 Apr 2023
fitrgp does not [yet] support GPU arrays. You can easily scroll down the doc page and check "Extended Capabilities" for each function. UseParallel, as the name suggests, invokes parallel computations on CPU workers, not on the GPU.
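To illustrate the distinction, here is a minimal sketch of what UseParallel actually does for fitrgp: the hyperparameter optimization is spread across CPU workers in a parallel pool. The data sizes and pool setup below are illustrative, not from the original question.

```matlab
% Sketch: UseParallel distributes hyperparameter optimization over
% CPU workers from a parallel pool -- the GPU is never involved.
if isempty(gcp('nocreate'))
    parpool;                          % start a pool of CPU workers
end

X = randn(500, 4);                    % illustrative data
y = randn(500, 1);

mdl = fitrgp(X, y, ...
    'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', ...
        struct('UseParallel', true)); % parallel over CPU workers only
```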

4 Comments

M N
M N on 8 Apr 2023
Edited: M N on 8 Apr 2023
Thanks Ive, but my confusion is why the app seems to use the GPU even though gpuArray and tall arrays are not supported by fitrgp. Is there something I am missing?
Also, when I generate a function from the Regression Learner app, the generated fitrgp code does not include the "Use Parallel" setting.
When I run it from a MATLAB script, even with a parallel pool started, it only uses the CPU and no GPU at all.
So I am thinking I am missing some setting in my script.
No, fitrgp does not even accept gpuArrays to begin with:
X = randn(1e4, 5);
y = randn(1e4, 1);
mdl = fitrgp(gpuArray(X), gpuArray(y));
Error using RegressionGP.prepareData
The value of X must not be a gpuArray.
Error using gpuArray
Unable to find a supported GPU device. For more information on GPU support, see GPU Computing Requirements.
Also, you may try, but Regression Learner doesn't even let you select gpuArrays as predictors or response. So, I'm not sure what you mean exactly by "app seems to use GPU". Of course, there are other functions implemented in the Regression Learner app that benefit from GPU computations, such as fitglm or fitlm, but you should use them in your own functions/scripts:
X = randn(1e4, 5);
y = randn(1e4, 1);
mdl = fitlm(gpuArray(X), gpuArray(y));
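A slightly fuller sketch of the fitlm route, guarding against the "Unable to find a supported GPU device" error shown above; the device check and gather step are illustrative additions, not part of the original answer.

```matlab
% Sketch: fitlm lists gpuArray support under its Extended Capabilities.
% Checking gpuDeviceCount first avoids the "no supported GPU" error.
if gpuDeviceCount > 0
    X = randn(1e4, 5);
    y = randn(1e4, 1);
    mdl = fitlm(gpuArray(X), gpuArray(y));        % fit on the GPU
    coefs = gather(mdl.Coefficients.Estimate);    % bring results to CPU
else
    warning('No supported GPU found; fitting on CPU instead.');
    mdl = fitlm(randn(1e4, 5), randn(1e4, 1));
end
```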
Thanks for the explanation. My bad; in my pursuit of maximizing usage of my GPU/CPU, I was imagining things beyond the documentation.
In the screenshot below, when I run the app, I see 7% GPU usage on my computer as soon as I hit the Train button in the Regression Learner app.
That led me to think that something in the app or the MATLAB instance itself was running on the GPU while fitrgp was training. But I think it's the Regression Learner UI that is using the GPU, not fitrgp; fitrgp itself consumes the CPU.
Is there a plan by MathWorks to add gpuArray support for fitrgp?
For that you'd need to contact TMW directly :-)


More Answers (0)

Release: R2021b
Asked: M N on 7 Apr 2023
Commented: on 9 Apr 2023
