I am attempting to enable parallel computing when training my RL agent in R2022a. Forgive the basic question regarding parallel computing, as this is my first attempt. My laptop has an NVIDIA GeForce RTX 3060. I keep running into an issue where the pool goes idle ("IdleTimeout"), and I have had to restart the pool on several occasions. I left it to run overnight and again it stalled and stopped training. I am not sure what is happening. Any help would be great. I have included some screen grabs below, along with the code from my RL script, the GPU device properties, and the error showing the pool stopped. I did have Episode Manager open during the simulation, but the learning did not seem the same as when I originally ran the training on the CPU. Also, in R2022a I am unable to stop training via Episode Manager.
Thanks in advance,
trainingOpts.UseParallel = true;                                     % train with a parallel pool
trainingOpts.ParallelizationOptions.Mode = 'async';                  % workers send data asynchronously
trainingOpts.ParallelizationOptions.StepsUntilDataIsSent = 32;       % send data every 32 steps
trainingOpts.ParallelizationOptions.DataToSendFromWorkers = 'Experiences'; % workers send experiences, not gradients
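In case it is relevant, this is a rough sketch of how I understand the pool could be started explicitly before calling train, with a longer IdleTimeout so it does not shut down while idle (the timeout value here is just an example, and I am assuming the default cluster profile):

```matlab
% Start (or reuse) a parallel pool with a longer idle timeout.
% IdleTimeout is in minutes; Inf would disable the timeout entirely.
p = gcp('nocreate');          % get current pool without creating one
if isempty(p)
    p = parpool('IdleTimeout', 240);  % 4-hour timeout (example value)
else
    p.IdleTimeout = 240;      % extend the timeout of an existing pool
end
```

I am not sure whether extending the timeout would actually fix the overnight stall, or whether the pool is stopping for a different reason.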
The "IdleTimeout" error is shown below: