ExperienceBuffer has Length 0 when I load a saved agent and continue reinforcement learning training

8 views (last 30 days)
Hi all,
I'm trying to continue training a saved agent. In the training options of this saved agent, SaveExperienceBufferWithAgent is set to true. But when I load the saved agent and inspect the ExperienceBuffer property, I notice its Length is 0. I looked for documentation on this property but couldn't find any information. If I stop a training run and check the Length property of the agent directly in the workspace, it has a nonzero value.
My question is: what does this "Length" mean? If it's 0 when I continue training a saved agent as described in https://de.mathworks.com/matlabcentral/answers/495436-how-to-train-further-a-previously-trained-agent?s_tid=answers_rc1-2_p2_MLT , does training really continue with the saved agent and the saved experience buffer?
Yours

Accepted Answer

Takeshi Takahashi, 20 April 2021
Length 0 means there isn't any experience in the buffer. I think the experience buffer wasn't saved because of this bug. Please set agent.AgentOptions.SaveExperienceBufferWithAgent = true immediately before saving the agent.
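A minimal sketch of this workaround (the file name, variable names, and the use of a DQN-style agent are illustrative; the key point is setting the option right before saving):

```matlab
% Assumes 'agent' is a trained agent already in the workspace.
% Set the option immediately before saving so the buffer is serialized
% along with the agent, working around the bug described above.
agent.AgentOptions.SaveExperienceBufferWithAgent = true;
save("savedAgent.mat", "agent");

% In a later session: reload and check that the buffer survived.
loaded = load("savedAgent.mat");
agent = loaded.agent;
disp(agent.ExperienceBuffer.Length)   % should now be nonzero

% To actually reuse the restored experiences when calling train again
% (option available in R2020b-era agent options), keep the buffer:
agent.AgentOptions.ResetExperienceBufferBeforeTraining = false;
```

With ResetExperienceBufferBeforeTraining set to false, a subsequent call to train should resume from the restored buffer instead of starting from an empty one.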
2 comments
Yikai, 20 April 2021 (edited 20 April 2021)
Thanks for answering. I will try this.
Dmitriy Ogureckiy, 12 January 2023
May I ask: are the network weights saved when the agent is saved between simulations?


More Answers (0)

Release: R2020b
