Deep Learning using GPU

Sri Adiyanti on 18 May 2021
Answered: Jayanti on 12 Feb 2025 at 6:44
Hi there,
I want to run my deep learning model (an .m script) for aesthetic designs (thousands of images of trochoids) on a Quadro RTX 4000 GPU, with an i9 CPU (10 cores).
Any suggestions so I can run it efficiently? Do I need to modify my .m script?
Thanks,
S

Answers (1)

Jayanti on 12 Feb 2025 at 6:44
Hi Sri,
If you want to run a deep learning model using a GPU in MATLAB, the first step is to verify that your GPU is supported. The GPU Computing Requirements page in the MATLAB documentation lists the supported devices, and you can also check directly from the command line, as in the sketch below.
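A minimal sketch of that check (assuming the Parallel Computing Toolbox is installed):

gpu = gpuDevice;       % errors if no supported GPU device/driver is found
fprintf('Using %s (compute capability %s)\n', gpu.Name, gpu.ComputeCapability);
gpuDeviceTable         % lists all detectable GPUs (available in R2021a and later)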
If GPU is supported and you have the Parallel Computing Toolbox, MATLAB should automatically utilize the GPU for computations.
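For example, with trainNetwork the relevant option looks like the sketch below; the variable names imdsTrain and layers are placeholders for your own datastore and network, not from the original question.

% 'ExecutionEnvironment' is 'auto' by default, so training already uses a
% supported GPU when one is present; setting it to 'gpu' makes the intent
% explicit and errors early if the GPU cannot be used.
opts = trainingOptions('adam', ...
    'ExecutionEnvironment', 'gpu', ...
    'MiniBatchSize', 128, ...   % reduce if the 8 GB on a Quadro RTX 4000 runs out
    'Plots', 'training-progress');
% net = trainNetwork(imdsTrain, layers, opts);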
However, if your workflow does not pick up the GPU automatically, or if you are working with custom training loops, you will need to take an extra step. In that case, convert the data to gpuArray and ensure that you are using functions that support gpuArray inputs, as sketched below.
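A minimal custom-loop sketch of that conversion (the sizes and formats here are illustrative assumptions):

X = rand(224, 224, 3, 32, 'single');   % stand-in for one mini-batch of images
dlX = dlarray(X, 'SSCB');              % spatial, spatial, channel, batch
if canUseGPU                           % true when a supported GPU is available
    dlX = gpuArray(dlX);               % dlfeval/dlgradient calls now run on the GPU
end
out = gather(extractdata(dlX));        % copy results back to CPU memory when needed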
The list of GPU-supported functions is available in the documentation under "Run MATLAB Functions on a GPU".
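As a quick illustration, fft is one such function: given a gpuArray input, it runs on the GPU.

methods('gpuArray')           % lists built-in functions that accept gpuArray inputs
A = gpuArray(rand(4096, 'single'));
B = fft(A);                   % runs on the GPU because A is a gpuArray
result = gather(B);           % transfer the result back to host memory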
For more detailed guidance on running deep learning models on a GPU, the Deep Learning Toolbox documentation on GPU, parallel, and cloud training is a good starting point.
Hope this helps!
