How to force enable GPU usage in fitrgp
When I use the Regression Learner app and select the 'Use Parallel' option for training, I can see my NVIDIA GPU (compute capability 7.2) being used.
But when I generate a function from the app and run it from a script, the GPU is not used. Can I set something in the script so that it uses the GPU?
I tried gpuArray and tall arrays, and neither is supported by fitrgp.
regressionGP = fitrgp( ...
    X, ...
    Y, ...
    'BasisFunction', 'constant', ...
    'KernelFunction', 'exponential', ...
    'Standardize', true, ...
    'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', struct( ...
        'Verbose', 1, ...
        'UseParallel', true));
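A minimal sketch, assuming X and Y might have been created as gpuArray variables earlier in the script: since fitrgp rejects gpuArray inputs, the data would first have to be brought back to host memory, for example with gather.

% Hypothetical example: move gpuArray data back to the CPU before fitting,
% because fitrgp does not accept gpuArray inputs.
if isa(X, 'gpuArray')
    X = gather(X);
end
if isa(Y, 'gpuArray')
    Y = gather(Y);
end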
3 Comments
Walter Roberson
8 Apr 2023
In MATLAB Answers, each user can communicate in whatever language they feel most comfortable communicating in. If a reader has difficulty understanding, then the reader can ask for clarification of particular parts... or the reader can move on to other questions.
There is no requirement that people post in English, and if they do post in English, it is fine if they used a machine translation that might get words, capitalization, or contractions wrong compared to "perfect" English. We are here for MathWorks products, not for complaining about typographic mistakes.
Accepted Answer
Ive J
7 Apr 2023
fitrgp does not [yet] support GPU arrays. You can scroll down to the "Extended Capabilities" section of each function's documentation page to check what it supports. UseParallel, as the name suggests, invokes parallel computation; it does not move the fit onto the GPU.
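A minimal sketch of what 'UseParallel' actually controls, assuming Parallel Computing Toolbox is installed and X and Y hold the training data: the hyperparameter search is spread across a pool of CPU workers, while the GP fit itself stays on the CPU.

% Start a pool of CPU workers if one is not already running
% (requires Parallel Computing Toolbox).
if isempty(gcp('nocreate'))
    parpool;
end

% 'UseParallel' distributes the hyperparameter optimization over the
% pool's workers; it does not run the regression on the GPU.
regressionGP = fitrgp(X, Y, ...
    'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', struct( ...
        'UseParallel', true, ...
        'Verbose', 1));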
4 Comments
More Answers (0)