Randomness is lost in parfor loop on GPU?
1 view (last 30 days)
I have code with the following structure:
myoutputs1 = my_function1(myinputs1);
seed = #;
rng(seed);
parfor iter = 1:50
    myoutputs2 = my_gpu_function2(myoutputs1, my_function_of_Gaussian);
    result(iter).name = myoutputs2;
end
I want to model the random behavior of a system by running this code repeatedly. To do that, I copy the code and change the value of the seed: for example, "my_code_seed1.m" uses "seed = 1", and "my_code_seed2.m" uses "seed = 2". I run these scripts in parallel on a GPU cluster to save computation time. However, the results from the different scripts are very similar, even though I am giving each one a different seed value.
Do you have an idea why rng(seed); is not working properly in this case? I appreciate any help.
Answers (1)
Steven Lord on 13 April 2022
What I'm doing is to copy this code and change the value of the seed. For example, the "my_code_seed1.m" uses "seed = 1", and "my_code_seed2.m" uses "seed = 2".
So if later on you find a bug in my_code_seed42.m you're going to go back and modify the previous 41 files? That's inefficient. Instead write the code once as a function that accepts a seed value and call that function with the various seeds as input.
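A minimal sketch of that refactoring, reusing the placeholder names from the question (my_function1, my_gpu_function2, myinputs1, and my_function_of_Gaussian are stand-ins for your actual code):

```matlab
function result = run_simulation(seed, myinputs1)
%RUN_SIMULATION  One realization of the model for a given seed.
%   Sketch only: the called functions are placeholders from the question.
    rng(seed);  % seed the client's random number generator
    myoutputs1 = my_function1(myinputs1);
    parfor iter = 1:50
        myoutputs2 = my_gpu_function2(myoutputs1, my_function_of_Gaussian);
        result(iter).name = myoutputs2;
    end
end
```

Then one driver loop replaces all fifty copied scripts, and a bug fix only has to be made in one place:

```matlab
for s = 1:50
    results{s} = run_simulation(s, myinputs1);
end
```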
As for the randomness in a parfor, see this documentation page on repeating random numbers in a parfor loop and the page on controlling randomness to which its first paragraph links.
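One likely cause of the behavior in the question: calling rng(seed) on the client does not reseed the workers in a parfor loop, since each worker has its own random number stream (and random numbers generated on the GPU itself are controlled by gpurng, not rng). The documentation pages above describe using a generator that supports substreams so each iteration draws from its own reproducible substream. A sketch of that pattern, assuming the random draws happen in CPU code inside the loop:

```matlab
% Reproducible, independent random numbers inside parfor:
% share one seeded stream via parallel.pool.Constant and give each
% iteration its own substream ('Threefry' supports substreams).
seed = 1;  % vary this value per run
sc = parallel.pool.Constant(RandStream('Threefry', 'Seed', seed));
r = zeros(50, 1);
parfor iter = 1:50
    stream = sc.Value;        % this worker's copy of the stream
    stream.Substream = iter;  % independent substream per iteration
    r(iter) = rand(stream);   % draw from that substream
end
```

With this structure, running the script twice with the same seed reproduces the same results, and changing the seed changes every iteration's draws.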