Reinforcement Learning Zero Reward

4 views (last 30 days)
Evan Little on 13 May 2021
Answered: Ari Biswas on 13 May 2021
I'm training multiple reinforcement learning agents using a Simulink model with a custom function (to simulate a card game).
I can compile and run the model in Simulink with no problems, and attaching a scope to the reward and isdone signals shows that they are set correctly (the reward is non-zero, and the isdone signal terminates the simulation at the correct time).
However, when I try to train the model, each agent shows zero reward.
Similar problems suggest that the issue may be the isdone flag being set incorrectly. However, I am confident this is not the case: each step prints text to the command window (as desired), which suggests the model is simulating correctly during training.
To recreate, run the 'CreateAgents' and 'CreateEnvironment' scripts, load the 'WhistLearningVaribles.mat' file (which contains the variables needed for the simulation and the training options), run 'myResetFunction', and train using the command:
stats = train([Player1, Player2, Player3, Player4],env,trainOpts);
The other functions must also be present in the file structure, as the model references them during simulation.
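For reference, the environment is wired up roughly along the following lines; the model name, block paths, and spec dimensions below are placeholders, and the actual values are set in 'CreateAgents' and 'CreateEnvironment':

mdl = "WhistModel";                                  % placeholder model name
agentBlks = [mdl + "/RL Agent 1", mdl + "/RL Agent 2", ...
             mdl + "/RL Agent 3", mdl + "/RL Agent 4"];
obsInfo = rlNumericSpec([52 1]);                     % placeholder observation spec
actInfo = rlFiniteSetSpec(1:13);                     % placeholder action spec
env = rlSimulinkEnv(mdl, agentBlks, ...
    {obsInfo, obsInfo, obsInfo, obsInfo}, ...
    {actInfo, actInfo, actInfo, actInfo});
env.ResetFcn = @myResetFunction;                     % reset the card game before each episode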
Any advice would be much appreciated. Thanks!

Accepted Answer

Ari Biswas on 13 May 2021
In your Simulink model workspace you have several agent objects saved with the same variable names as those referenced in the RL Agent blocks. This causes a conflict when the agents are resolved during training. Remove these agent objects from the model workspace if you don't intend to use them; you can open the model workspace by pressing Ctrl+H from your Simulink model. Once you remove them you will be able to train correctly.
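If you prefer to do this programmatically rather than through the Model Explorer, a minimal sketch along these lines should work (the model name 'WhistModel' is a placeholder; substitute your own):

mdl = 'WhistModel';                       % placeholder - use your model's name
load_system(mdl);                         % make sure the model is loaded
mws = get_param(mdl, 'ModelWorkspace');   % handle to the model workspace
whos(mws)                                 % list the variables stored there
clear(mws)                                % remove them all (only if you don't need them)
save_system(mdl)                          % save the model so the change persists

With the stale copies gone, the RL Agent blocks resolve to the agents you pass to train.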

More Answers (0)

Release

R2021a
