Reinforcement learning agent saving error

5 views (last 30 days)
Yihao on 24 Feb 2024
Edited: mahdi on 23 Sep 2025
Hi, I am not able to save the agent because of the warning "Warning: Unable to save the agent to the directory "savedAgents". Increase the disk space or check SaveAgentCriteriaValue in training options." However, I have at least 200 GB of free space on the drive containing the MATLAB directory.
How can I solve this? Thanks in advance.
  2 Comments
Emmanouil Tzorakoleftherakis on 27 Feb 2024
Do you have write access to that folder?
Yihao on 29 Feb 2024
Maybe that is the cause; I will try again. By the way, in rlTrainingOptions I changed the storage type with 'SimulationStorageType',"none" instead of the default "memory". That works, and it also reduces the amount of stored data.

Sign in to comment.

Answers (3)

Yihao on 29 Feb 2024
Edited: Yihao on 29 Feb 2024
In rlTrainingOptions, I changed the storage type with 'SimulationStorageType',"none" instead of the default "memory". That works, and it also reduces the amount of stored data.
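For illustration, a minimal sketch of how this option can be set; all the other name-value pairs here are illustrative, not taken from the original setup:

trainOpts = rlTrainingOptions( ...
    "MaxEpisodes",5000, ...                  % illustrative episode count
    "SimulationStorageType","none", ...      % skip storing simulation data (default is "memory")
    "SaveAgentCriteria","EpisodeReward", ... % illustrative save criterion
    "SaveAgentValue",100, ...                % illustrative reward threshold
    "SaveAgentDirectory","savedAgents");     % default save folder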
  2 Comments
Ari Biswas on 29 Feb 2024
Hello Yihao,
The message is displayed if there were any issues when saving agents during training. It should not be related to the SimulationStorageType value, which saves simulation results (not agents) to memory or disk. If possible, please share the steps to reproduce the issue, such as how you created the agent and environment and configured the training options. We will look into it. Thanks for using our tools.
Yihao on 29 Feb 2024
Hi, Ari. Sorry, I cannot share the detailed code. Basically, each training episode runs a simulation with a step of around 20 µs for about 0.02 s, and the agent at around the 4000th training episode cannot be saved even though my disk space is quite sufficient.
Thanks for your help.

Sign in to comment.


mahdi on 2 Aug 2025
Hi, I'm using MATLAB R2023b and having the same issue: the toolbox sometimes cannot save the agent automatically. Even the command save('trainedAgent.mat','agent') results in:
>> save('trainedAgent.mat','agent')
Error using save
Unable to save file 'D:\trainedAgent.mat'. The file could not be closed, and might
now be corrupt.
  2 Comments
Walter Roberson on 2 Aug 2025
That problem can occur if the destination drive runs out of room.
That problem can also occur if the destination drive is mounted to OneDrive.
That problem can also occur if the destination drive is a network drive (especially if it is an NFSv2 drive; NFSv3 is less likely to have this problem).
mahdi on 2 Aug 2025
Thanks for your reply, Dear Walter. I should mention that the destination drive is neither mounted to OneDrive nor a network drive. I have 932 GB of free space on the destination drive, which is a local disk (D:\), while the agent takes only 388-450 MB.
In addition, my PC has 32 GB of RAM and MATLAB is the only program running, so memory is large enough, and I don't think agent data is being lost before it is written to disk. There might be a bug in the toolbox.

Sign in to comment.


mahdi on 6 Aug 2025
I finally figured out what was wrong: reducing the TD3 agent's experience replay buffer from 5M to 2.5M fixed the problem. At first I suspected that Windows Security was preventing MATLAB from saving the agent, so I excluded the directory of the running .m file.
To add MATLAB exclusions:
  • Open Windows Security → Virus & threat protection
  • Manage settings → Add or remove exclusions
  • Add these exclusions:
    • Folder: your MATLAB working directory (where savedAgents is)
    • Process: matlab.exe
    • File type: .mat
However, this didn't help either. Then I lowered the experience buffer, as in the sketch below, and this warning never showed up again.
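A minimal sketch of that change, assuming the TD3 agent is configured through rlTD3AgentOptions (the actor and critics here are placeholders built elsewhere):

agentOpts = rlTD3AgentOptions( ...
    "ExperienceBufferLength",2.5e6);  % reduced from 5e6; the save warning stopped appearing
% agent = rlTD3Agent(actor, [critic1 critic2], agentOpts);  % actor/critics created elsewhere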
  1 Comment
mahdi on 23 Sep 2025
Edited: mahdi on 23 Sep 2025
One more thing to add: if you do need to increase the experience replay buffer to avoid catastrophic forgetting in your RL agent, and you need more than a 2.5e6 buffer size (like 5M in my case), there is a way to save the agent and avoid this error.
  • By default, MATLAB saves using the -v7 MAT-file format.
  • -v7 cannot handle arrays larger than 2^31-1 elements (~2.1e9), or roughly 2 GB per variable.
  • A replay buffer of 5M transitions, each with several fields, crosses that threshold (see the rough estimate below).
  • That's why it works at 2.5M but fails at 5M.
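A back-of-envelope estimate shows why; the observation size here is an assumption for illustration, not measured from any actual model:

nTransitions   = 5e6;  % replay buffer length
obsElements    = 50;   % hypothetical observation size per transition (an assumption)
bytesPerDouble = 8;    % double-precision storage
obsBytes = nTransitions * obsElements * bytesPerDouble  % ~2e9 bytes, about 2 GB

So the observation array alone can already hit the -v7 per-variable limit.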
Solution: Save using -v7.3 explicitly
-v7.3 uses HDF5 under the hood and can handle multi-GB data.
After training:
save('F:\RL_Agents\agent_with_buffer.mat','agent','-v7.3') % or wherever you want to save the trained agent
Unfortunately, you cannot rely on autosaving during training; MATLAB still defaults to -v7 there. Hence, you may have no choice but to set:
trainOpts = rlTrainingOptions("SaveAgentCriteria","none", ...
    "SaveAgentValue","none")
This way you disable the built-in autosave, and you should manually save the agent after training, as in the sketch below.
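Putting it together, a minimal sketch of the full train-then-save workflow; agent and env are assumed to exist already, and the file name is illustrative:

trainOpts = rlTrainingOptions("SaveAgentCriteria","none", ...
    "SaveAgentValue","none");                 % disable built-in autosave
trainResult = train(agent, env, trainOpts);   % agent and env created elsewhere
save("trainedAgent.mat","agent","-v7.3");     % HDF5-based format; handles variables over 2 GB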
I hope the MathWorks development team will fix this issue in future MATLAB versions. I've been having this issue up until MATLAB R2024a.

Sign in to comment.
