Don't need to save 'savedAgentResultStruct' with RL agent

When I save agents during RL training using the 'EpisodeReward' criterion, MATLAB also saves 'savedAgentResultStruct' along with the agent, which increases the file size. Is there an option to turn off saving 'savedAgentResultStruct'?
Thanks

4 Comments

Emmanouil Tzorakoleftherakis on 23 Feb 2021
Hello,
I am not sure savedAgentResultStruct is part of the standard way of saving info in RL Toolbox. Can you share the script you are using? Are you following a shipping example?
Sayak Mukherjee on 23 Feb 2021
Hi Emmanouil,
Thanks for your reply. I first define the agent options:
agentOptions = rlTD3AgentOptions;
agentOptions.SampleTime = Ts;
agentOptions.DiscountFactor = 0.99;
agentOptions.MiniBatchSize = 100;
agentOptions.ExperienceBufferLength = 1e4;
agentOptions.SaveExperienceBufferWithAgent = false;
agentOptions.TargetSmoothFactor = 5e-3;
agentOptions.TargetPolicySmoothModel.Variance = 0.2; % target policy noise
agentOptions.TargetPolicySmoothModel.LowerLimit = -0.5;
agentOptions.TargetPolicySmoothModel.UpperLimit = 0.5;
agentOptions.ExplorationModel = rl.option.OrnsteinUhlenbeckActionNoise; % set up OU noise as exploration noise (default is Gaussian for rlTD3AgentOptions)
agentOptions.ExplorationModel.MeanAttractionConstant = 1;
agentOptions.ExplorationModel.Variance = 0.09;
Then I define the agent:
agent = rlTD3Agent(actor, [critic1,critic2], agentOptions);
followed by the training options:
trainOpts = rlTrainingOptions(...
'MaxEpisodes',maxEpisodes,...
'MaxStepsPerEpisode',maxSteps,...
'ScoreAveragingWindowLength',250,...
'Verbose',true,...
'StopTrainingCriteria','AverageReward',...
'StopTrainingValue',250,...
'SaveAgentCriteria','EpisodeReward',...
'SaveAgentValue',50);
and finally run training:
trainingStats = train(agent,env,trainOpts);
All of these steps I took directly from the bipedal robot example. Let me know if changing any of these parameters will stop saving the 'savedAgentResultStruct' variable.
I do not need to save it with every agent; I only need it when training stops, which I am already saving with
save(['filename' datestr(now,'mm_dd_yyyy_HHMM')],'trainingStats','-v7.3')
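If shrinking the files after the fact is acceptable, one workaround (not a built-in option, as far as I know) is a post-training cleanup pass over the saved-agent folder. The sketch below assumes the default 'savedAgents' folder and the variable names 'saved_agent' / 'savedAgentResultStruct' that train() uses when saving agents; verify them against your own saved files first:

```matlab
% Re-save each saved agent file without the bulky savedAgentResultStruct.
% Assumes the default 'savedAgents' folder and the variable names used
% by train() ('saved_agent', 'savedAgentResultStruct').
files = dir(fullfile('savedAgents', 'Agent*.mat'));
for k = 1:numel(files)
    f = fullfile(files(k).folder, files(k).name);
    S = load(f);                 % loads saved_agent and savedAgentResultStruct
    saved_agent = S.saved_agent;
    save(f, 'saved_agent');      % overwrite the file with the agent only
end
```

Run this once after training finishes; it leaves only the trained agents on disk, which is usually all you need for simulation or deployment.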
M.Y. C. on 5 Mar 2021 (edited 5 Mar 2021)
Unfortunately, there is no direct option. I found an indirect workaround by hooking into the environment reset function. My code embeds the logic of getlatestfile.m inline rather than calling it as a separate function. Copy and paste the code below as localResetFcn(in). It splits each 'Agent__.mat' file into two .mat files: 'Agent__.mat', which afterwards contains only the trained agent from that iteration, and 'savedAgentResultStruct.mat', which is overwritten at every iteration. The code is not optimized, so set 'SaveAgentValue' in rlTrainingOptions to '-inf'. If you optimize the code, please share it under this question.
function in = localResetFcn(in)
persistent a
if ~isempty(a)
    % Find the latest file in the 'savedAgents' directory
    dirc = dir('savedAgents');
    % Filter out all the folders
    dirc = dirc(~[dirc.isdir]);
    % I holds the index of the newest file (largest datenum)
    [~, I] = max([dirc(:).datenum]);
    if ~isempty(I)
        latestfile = dirc(I).name;
        % Split the saved file into agent and result struct
        FolderName = 'savedAgents';
        File = fullfile(FolderName, latestfile);
        load(File);
        ResultFile = fullfile(FolderName, 'savedAgentResultStruct');
        save(ResultFile, 'savedAgentResultStruct');  % overwritten every iteration
        save(File, 'saved_agent', '-mat');           % keep the agent only
    end
end
a = 1;
% Define random initial values here as your simulation requires.
end
Sayak Mukherjee on 2 Dec 2021
Hi M.Y.C.
Were you able to optimize the code?


Answers (0)

Category: Reinforcement Learning Toolbox

Release: R2020b

Asked: 22 Feb 2021

Last comment: 2 Dec 2021
