Exclusively Utilizing NVIDIA GeForce RTX GPU for MATLAB UNet Model Training: Issue with GPU Selection

2 views (last 30 days)
Hi,
I am using MATLAB to train a UNet model for semantic segmentation purposes on my desktop computer running Windows 11. My computer is equipped with a CPU, GPU 0 (Intel(R) UHD Graphics 770), and GPU 1 (NVIDIA GeForce RTX 3070).
My goal is to exclusively utilize GPU 1 (NVIDIA GeForce RTX 3070) for the training process. To ensure this, I have set the 'ExecutionEnvironment' option in my training parameters to 'gpu'.
However, during training, I've noticed that GPU 1's usage remains at 0% while the CPU's usage is considerably high. Even when I call delete(gcp('nocreate')) in my code to shut down any open parallel pool, only GPU 0 (Intel(R) UHD Graphics 770) shows minor activity, at around 1% to 5% usage.
I'm seeking guidance on how to resolve this issue and ensure that my UNet model is trained exclusively using GPU 1. Is there a specific configuration or step that I might be missing? Your assistance in resolving this matter would be greatly appreciated.
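For reference, here is a minimal sketch of how I am selecting the device and setting up training; the solver and epoch count below are placeholders, and I am assuming the RTX 3070 is device 1 in MATLAB's enumeration (gpuDevice only lists CUDA-capable NVIDIA devices, so the Intel UHD Graphics 770 should not appear there):
gpuDeviceTable                          % list the CUDA devices MATLAB can see
gpuDevice(1)                            % select the NVIDIA GeForce RTX 3070
options = trainingOptions('adam', ...   % placeholder solver
    'ExecutionEnvironment', 'gpu', ...  % train on the selected GPU
    'MaxEpochs', 10);                   % placeholder epoch count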
  7 comments
Gobert on 5 Sep 2023
I also tried that, and it did not work.
Sam Marshalik on 5 Sep 2023
@Gobert, let's confirm that you are able to use the GPU device in general. Can you try something like this:
a = rand(100);        % 100-by-100 random matrix on the CPU
aGPU = gpuArray(a);   % copy it onto the GPU
fft(aGPU)             % compute an FFT on the GPU
Does the above command run successfully? Can you try bumping up the size of 'a' and check whether your GPU shows any activity in Task Manager?
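For a longer-running test that is easier to spot, you could try something along these lines (the matrix size here is arbitrary; scale it to fit your GPU memory):
a = rand(8000, 'single');   % roughly 256 MB of data on the CPU
aGPU = gpuArray(a);         % transfer it to the selected GPU
tic
b = fft(aGPU);              % FFT executes on the GPU
wait(gpuDevice)             % block until GPU work finishes, for accurate timing
toc
Also note that Task Manager's default GPU graphs track the 3D and Copy engines; CUDA compute work often registers as 0% there unless you switch one of the graphs to 'Compute' or 'Cuda'.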


Answers (0)

Release: R2023a