Load data into experience buffer: DDPG agent

I am using RL Toolbox version 1.1 with MATLAB R2019b and using the DDPG agent to design a controller. Is there a way to load data (state, action, reward, next state) collected from real experiments into the experience buffer before starting training?

Answers (2)

JiaZheng Yan on 31 Mar 2020

1 vote

I found a way to view the Memory of the experience buffer.
You can open the file "ExperienceBuffer.m", which is in "...\Matlab\toolbox\rl\rl\+rl\+util".
In this file you can see the property attributes of the variable Memory.
Then set:
agentOpts.SaveExperienceBufferWithAgent = true;
agentOpts.ResetExperienceBufferBeforeTraining = false;
After your training, you can get the data in agent.ExperienceBuffer.Memory.
This also means that you can modify and reuse the training data.
I hope this method works for you :)
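The steps above can be sketched end-to-end; `env`, `trainOpts`, and the actor/critic construction are placeholders for your own setup, not part of the original answer:

```matlab
% Keep the experience buffer when saving the agent, and do not clear it
% when training resumes, so collected experiences survive between runs.
agentOpts = rlDDPGAgentOptions;
agentOpts.SaveExperienceBufferWithAgent = true;
agentOpts.ResetExperienceBufferBeforeTraining = false;

% ... build the DDPG agent with these options and train as usual ...
% agent = rlDDPGAgent(actor, critic, agentOpts);
% trainingStats = train(agent, env, trainOpts);

% After training, the stored experiences are accessible:
mem = agent.ExperienceBuffer.Memory;   % cell array of experiences
```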

8 Comments

Ao Liu on 3 Jul 2020
Hello, I am using the 2020a toolbox and ran into the same problem. Making the change you describe does not seem to work. Is there any solution? Thanks!
JiaZheng Yan on 4 Jul 2020
I tried it: setting the attributes of the Memory variable in either of the two locations shown makes it possible to display and access it.
Remember to add the following lines to the code where you set up the agent's training options:
agentOpts.SaveExperienceBufferWithAgent = true; % (this line is the key)
agentOpts.ResetExperienceBufferBeforeTraining = false;
After training completes, you should be able to see the Memory elements.
The length of Memory may limit what is displayed; assigning a sampled element lets you inspect the data in more detail:
a = agent.ExperienceBuffer.Memory{1} % inspect a single element
The entries correspond to (state, action, reward, next state, is_done).
Save this agent and load it in the next training session to train the agent repeatedly.
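A sketch of unpacking one Memory element, assuming the five-entry cell layout described above (state, action, reward, next state, is_done):

```matlab
% Inspect a single stored experience (layout assumed from the comment above).
exp = agent.ExperienceBuffer.Memory{1};
s     = exp{1};   % state
a     = exp{2};   % action
r     = exp{3};   % reward
sNext = exp{4};   % next state
done  = exp{5};   % is_done flag
```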
Fabian Hart on 21 Jul 2020
Thanks for your answer!
Unfortunately, I am having problems writing to MATLAB system files. When I try, the following message appears:
"Error writing ExperienceBuffer.m .... Access denied"
Could you please tell me how you managed that? (Windows 10)
JiaZheng Yan on 23 Jul 2020
Sorry, I hope you can provide a more detailed description or a screenshot of the error report, because I have never encountered that error.
(I guess it's a file path problem)
zhou jianhao on 25 Oct 2021
Hi, Jiazheng.
Your method works, thanks a lot!
One thing still bothers me: is there any way to access the memory buffer during training? I am asking because I want to use prioritized experience replay.
Hope to hear from you soon.
Thanks!
Regards!
zhou jianhao on 25 Oct 2021
I have to say that, for the moment, the RL toolbox in MATLAB is easy to use, but it is hard to obtain satisfactory performance, still far from the Python platforms.
zimeng Wang on 1 Apr 2022
Hello, I can use this method to view the data in the buffer, but how can I modify or delete the data in the buffer?
Arman Ali on 1 Aug 2022
Have you found the answer? If yes, please guide me.


Priyanshu Mishra on 26 Feb 2020

0 votes

Hi Daksh,
You may find the following link useful for your answer.

2 Comments

Daksh Shukla on 26 Feb 2020
Hello Priyanshu,
Thanks for your response.
However, the link does not exactly resolve the problem I am having. The link talks about running a lot of initial simulations and saving the agent with the experience buffer, but what I would like to do is use data from "real experiments", NOT simulations. I would like to add this data to the experience buffer (the replay memory) to kick-start the DDPG learning.
Based on all my reading and my attempts to access the experience buffer in MATLAB, it seems that the experience buffer object is a hidden property and I cannot upload data to it directly from an external source.
I would really appreciate it if you could let me know a direct way to upload data to the experience buffer, if there is one.
Francisco Serra on 20 Dec 2023
Any updates on this? @Daksh Shukla?
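For readers on later releases: starting with R2022a, Reinforcement Learning Toolbox documents an rlReplayMemory object with an append function, which is one way to load externally collected experiences. This sketch assumes that newer API plus the obsInfo/actInfo specifications of your environment, and does not apply to R2019b:

```matlab
% R2022a+ sketch: build a replay memory and append a real-world sample.
buffer = rlReplayMemory(obsInfo, actInfo, 1e6);   % capacity is illustrative

% One experience from a real experiment (placeholder values; a 4-D state
% and scalar action are assumed for illustration):
exp.Observation     = {rand(4,1)};
exp.Action          = {rand(1,1)};
exp.Reward          = 1;
exp.NextObservation = {rand(4,1)};
exp.IsDone          = 0;

append(buffer, exp);   % inject the collected data before training
```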



Release

R2019b

Asked: 23 Feb 2020

Last commented: 20 Dec 2023
