Using the GPU, multiply a 3D matrix by a 2D matrix (slicewise)

Hi everyone,
I am trying to vectorize the code for my neural network so that it runs quickly on the GPU.
I need to multiply each slice of a 3D matrix X by a 2D matrix T.
X = 3D matrix.
T = 2D matrix.
Is there a good, fast way to do this on the GPU? I have seen it suggested to use 'repmat' to expand the 2D matrix into a 3D one (duplicating it many times), but that feels wasteful and inefficient. For concreteness, a loop-based sketch of the operation I mean is shown below.
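
Here is that loop-based sketch (the variable names and sizes are just illustrative, not from my actual network):

X = rand(10, 10, 4);   % example 3D array: four 10-by-10 slices
T = rand(10);          % example 2D matrix
Y = zeros(size(X, 1), size(T, 2), size(X, 3));
for k = 1:size(X, 3)
    Y(:,:,k) = X(:,:,k) * T;   % multiply each slice of X by T
end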
Thanks!

Accepted Answer

Edric Ellis on 26 Jul 2016

0 votes

In this case, you can use pagefun. It applies a 2-D operation (here mtimes) to every page of its array inputs, and because T has only a single page it is reused for each slice of X. For example:

X = rand(10, 10, 4, 'gpuArray');   % four 10-by-10 slices on the GPU
T = rand(10, 'gpuArray');          % one 10-by-10 matrix on the GPU
pagefun(@mtimes, X, T)             % slice k of the result is X(:,:,k) * T
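
(Not part of the original answer, but as a quick sanity check you can compare the pagefun result against an explicit loop over slices; this sketch assumes the X and T defined above, and Yloop is a name introduced here:)

Y = pagefun(@mtimes, X, T);            % T's single page is reused for every page of X
Yloop = zeros(10, 10, 4, 'gpuArray');
for k = 1:4
    Yloop(:,:,k) = X(:,:,k) * T;       % same product, one slice at a time
end
max(abs(Y(:) - Yloop(:)))              % expect (near) zero: same arithmetic on the GPU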

2 Comments

Brad Hesse on 27 Jul 2016
Oh my god! I am speechless! My entire forward/backward propagation algorithm worked on the very first try after I completely rewrote it to be vectorized for GPU execution. This is at least a 40-50x speedup (admittedly, my original code wasn't well optimized for CPU execution to begin with).
I cannot believe how fast this is.
Thank you so much for your help, Edric. I had actually tried pagefun already, but it failed and I assumed it didn't work for multiplying 3D arrays by 2D matrices.
Edric Ellis on 27 Jul 2016
Glad it worked for you!


More Answers (0)
