Why do I get an error saying properties cannot be changed when I add an agent in Simulink?

cm s on 29 December 2023
Answered: Harsh on 21 March 2025
When I open an official example, such as the one comparing PID and DDPG control of the water level in a tank, the original file runs fine. But when I remove the agent from the official example and manually add an "agent" myself, clicking "Run" in the mlx file reports the following error:
Error using rl.train.SeriesTrainer/run
Error in 'RL_dianyesifu_DDPG_test/RL Agent': Unable to evaluate the mask initialization commands.
Error in rl.train.TrainingManager/train (line 479)
    run(trainer);
Error in rl.train.TrainingManager/run (line 233)
    train(this);
Error in rl.agent.AbstractAgent/train (line 136)
    trainingResult = run(trainMgr,checkpoint);
Caused by:
    Error using rl.env.internal.reportSimulinkSimError
    The 'Tunable' property of 'rlwatertank/RL Agent/RL Agent' cannot be changed while the simulation is running.
I am a beginner, please give me some advice, thank you!
  2 Comments
Emmanouil Tzorakoleftherakis on 9 January 2024
Did you update the agent variable everywhere? Make sure to update it on the RL Agent block in the Simulink model as well.
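For reference, one way to see which workspace variable name the block currently expects is to query the block mask programmatically; a minimal sketch, assuming the example's "rlwatertank" model is open and that the block's mask parameter is named "Agent":
get_param("rlwatertank/RL Agent","Agent")   % prints the workspace variable name the block looks up, e.g. 'agent'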
Mxolisi on 8 October 2024
Hi Emmanouil, I am having the same problem. How do we update these variables?


Answers (1)

Harsh on 21 March 2025
Hi @cm s,
Assuming you are referring to the "mlx" file available with the official water tank example from the documentation:
To use your own agent in the above example, you can load it programmatically from a "mat" file and then pass that agent to the "sim" function. Below are the changes needed to use a different agent from the one given in the example.
In the “Validate Trained Agent” section –
rng(1)                                 % fix the random seed for reproducibility
loadAgentData = load("myAgent.mat");   % load the saved agent from the MAT-file
myAgent = loadAgentData.myAgent;       % extract the agent object from the loaded struct
Then use this “myAgent” with the “sim” function –
experiences = sim(env,myAgent,simOpts);   % simulate the loaded agent in the environment
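As a quick check that the loaded agent was actually used, the returned experience structure can be inspected; a minimal sketch, assuming the default single-episode output of "sim" where the reward is logged as a timeseries:
totalReward = sum(experiences(1).Reward.Data)   % cumulative reward of the simulated episode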
Also modify the "RL Agent" block in the model by double-clicking it and entering your agent in the "Agent object" field. Note that you first need to load the agent into the base workspace before it can be used in the "RL Agent" block.
[Screenshot: "Block Parameters" dialog of the "RL Agent" block showing the "Agent object" field]
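If you prefer to update the block programmatically instead of through the dialog, a minimal sketch along the same lines (the model name "rlwatertank", the variable name "myAgent", and the mask parameter name "Agent" are assumptions for illustration):
mdl = "rlwatertank";
open_system(mdl)                                 % the model must be open to change block parameters
assignin("base","myAgent",myAgent)               % the RL Agent block resolves its agent from the base workspace
set_param(mdl + "/RL Agent","Agent","myAgent")   % point the block's "Agent object" field at the new variable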
If you want to understand why you got an error, please share the changes that you made to the "mlx" file before running it.
