Why is it that when I add an agent to Simulink, I get an error indicating that I cannot change its properties?

When I open an official example, such as the one that compares PID and DDPG control of the water level in a tank, the original file runs fine. But when I remove the agent from the official example and manually add an "agent" of my own, clicking "Run" in the .mlx file produces the following error:
Error using rl.train.SeriesTrainer/run
Error in 'RL_dianyesifu_DDPG_test/RL Agent': Failed to evaluate mask initialization commands.
Error in rl.train.TrainingManager/train (line 479)
    run(trainer);
Error in rl.train.TrainingManager/run (line 233)
    train(this);
Error in rl.agent.AbstractAgent/train (line 136)
    trainingResult = run(trainMgr,checkpoint);
Caused by:
    Error using rl.env.internal.reportSimulinkSimError
    The 'Tunable' parameter of 'rlwatertank/RL Agent/RL Agent' cannot be changed while the simulation is running.
I am a beginner, please give me some advice, thank you!
2 Comments
Emmanouil Tzorakoleftherakis on 9 Jan 2024
Did you update the agent variable everywhere? Make sure to update it on the RL Agent block in the Simulink model as well.
Mxolisi on 8 Oct 2024
Hi Emmanouil, I am having the same problem. How do we update these variables?
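
Below is a rough, hypothetical MATLAB sketch of what "updating the agent variable everywhere" can look like: the agent is (re)created in the base workspace, the environment is pointed at the newly added RL Agent block, and the workspace variable name ('agent' here) must match the name entered in the RL Agent block's dialog. The model and block names are taken from the question; the observation/action dimensions are assumptions and must match your own model.

% Hypothetical sketch -- adapt names and dimensions to your own model.
mdl = 'RL_dianyesifu_DDPG_test';      % Simulink model from the question
blk = [mdl '/RL Agent'];              % path of the manually added RL Agent block
open_system(mdl);

% Specs must match the signals wired into the RL Agent block (sizes assumed here).
obsInfo = rlNumericSpec([3 1]);
actInfo = rlNumericSpec([1 1]);

% Environment that references the *new* block path.
env = rlSimulinkEnv(mdl, blk, obsInfo, actInfo);

% Default DDPG agent created in the base workspace; the variable name 'agent'
% must be the same name referenced by the RL Agent block dialog.
agent = rlDDPGAgent(obsInfo, actInfo);

% Train against the environment; the block resolves the agent by name at run time.
trainOpts = rlTrainingOptions('MaxEpisodes', 200, 'MaxStepsPerEpisode', 200);
trainingStats = train(agent, env, trainOpts);

If the error persists, double-check that the block is not still pointing at a stale agent variable left over from the official example, since the agent referenced by the RL Agent block cannot be swapped while a simulation is already running.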


Answers (0)
