Create and Train a DQN Agent with Just a State Path and No Action Path

1 view (last 30 days)
Every example I have seen of a DQN in MATLAB uses two inputs, the state and the action. However, DQN RL can also be done with just one input, the state, but there are no examples for that case. How can that be done in MATLAB? My input would basically be a binary vector, and my output would be a choice between two actions.

Accepted Answer

Emmanouil Tzorakoleftherakis on 6 Jul 2020
Hello,
This page shows how this can be done in R2020a. We will have examples that show this workflow in the next release.
Hope that helps.
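
For reference, a minimal sketch of such a single-input (multi-output) Q-value critic using R2020a syntax; the length N of the binary observation, the layer names, and the network sizes are assumptions, not taken from the linked page:

% State-only DQN critic: one input path (the observation), one Q-value per action
N = 10;                                          % length of the binary state vector (hypothetical)
obsInfo = rlNumericSpec([N 1]);                  % observation: binary vector treated as numeric
actInfo = rlFiniteSetSpec([1 2]);                % two possible discrete actions

net = [
    imageInputLayer([N 1 1],'Normalization','none','Name','state')
    fullyConnectedLayer(64,'Name','fc1')
    reluLayer('Name','relu1')
    fullyConnectedLayer(numel(actInfo.Elements),'Name','qValues')];   % one output per action

critic = rlQValueRepresentation(net,obsInfo,actInfo,'Observation',{'state'});
agent  = rlDQNAgent(critic,rlDQNAgentOptions('UseDoubleDQN',true));

Because the critic outputs one Q-value per action, no separate action input path is needed; the agent picks the action whose output is largest.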
9 Comments
Emmanouil Tzorakoleftherakis on 6 Jul 2020
This sounds doable. You may even be able to do this without custom loops using built-in agents (something like centralized multi-agent RL). You can use a single agent and at each step extract the appropriate action and apply it to the appropriate part of the environment. The tricky part, as is typical of multi-agent RL, is to pick the right set of observations to make sure your process is Markov. This will likely require observations from each 'subagent', etc.
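
A hedged sketch of that "centralized" encoding, assuming two subsystems that each choose between two actions (the values are purely illustrative):

% Joint action space: each element is one combined action [actionA actionB]
actInfo = rlFiniteSetSpec({[1 1],[1 2],[2 1],[2 2]});

% Inside the environment step function, split the joint action:
% actionA = Action(1);   % applied to the first part of the environment
% actionB = Action(2);   % applied to the second part of the environment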
Huzaifah Shamim on 6 Jul 2020
Oh OK, interesting.
One last question (hopefully): in this link, LoggedSignals is used as the state. However, let's say I wanted LoggedSignals to contain three matrices or three vectors, one of them being the actual state. Could I keep those three matrices in LoggedSignals and then just let the observation be the actual next state? Additionally, if that is allowed, how would I define my observation info: would I use rlFiniteSetSpec or rlNumericSpec? (My official state would just be a binary vector of size N, but I want to have two other matrices, x and y let's say, so that I can utilize them in myStepFunction.m.)
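
A minimal sketch of that pattern, following the rlFunctionEnv custom-environment template; everything besides the toolbox functions is hypothetical, and the update, reward, and termination logic are placeholders. Since the observation itself is just a numeric binary vector, rlNumericSpec is used for it here, while the two auxiliary matrices live only in LoggedSignals:

N = 10;                                                        % length of the binary state vector (hypothetical)
obsInfo = rlNumericSpec([N 1],'LowerLimit',0,'UpperLimit',1);  % the observation is only the state
actInfo = rlFiniteSetSpec([1 2]);                              % two discrete actions
env = rlFunctionEnv(obsInfo,actInfo,@myStepFunction,@myResetFunction);

% --- myStepFunction.m ---
function [NextObs,Reward,IsDone,LoggedSignals] = myStepFunction(Action,LoggedSignals)
% LoggedSignals carries the state plus the two auxiliary matrices x and y;
% only the state is returned to the agent as the observation
state = LoggedSignals.State;
x     = LoggedSignals.X;     % auxiliary matrix, used only inside the step function
y     = LoggedSignals.Y;     % auxiliary matrix, used only inside the step function

% ... update state, x, and y based on Action (problem-specific) ...

LoggedSignals.State = state;
LoggedSignals.X     = x;
LoggedSignals.Y     = y;

NextObs = LoggedSignals.State;   % observation = the actual next state
Reward  = 0;                     % placeholder reward
IsDone  = false;                 % placeholder termination flag
end

% --- myResetFunction.m ---
function [InitialObservation,LoggedSignals] = myResetFunction()
LoggedSignals.State = zeros(10,1);   % initial binary state (hypothetical)
LoggedSignals.X     = zeros(10);     % auxiliary matrix x (hypothetical size)
LoggedSignals.Y     = zeros(10);     % auxiliary matrix y (hypothetical size)
InitialObservation  = LoggedSignals.State;
end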


More Answers (0)
