I am getting this error when I try to train a TD3 RL agent.
Thank you,
Apoorv Pandey

1 Comment

Emmanouil Tzorakoleftherakis on 24 Mar 2023
If you share a reproduction model, it will be easier to debug.


Accepted Answer

Cris LaPierre on 24 Mar 2023


When defining your rlQValueFunction, include the ObservationInputNames and ActionInputNames name-value pairs.
% Observation path layers
obsPath = [featureInputLayer( ...
        prod(obsInfo.Dimension), ...
        Name="netObsInput")
    fullyConnectedLayer(16)
    reluLayer
    fullyConnectedLayer(5,Name="obsout")];
% Action path layers
actPath = [featureInputLayer( ...
        prod(actInfo.Dimension), ...
        Name="netActInput")
    fullyConnectedLayer(16)
    reluLayer
    fullyConnectedLayer(5,Name="actout")];
%<snip>
critic = rlQValueFunction(net, ...
    obsInfo,actInfo, ...
    ObservationInputNames="netObsInput", ...
    ActionInputNames="netActInput")
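For reference, here is a minimal, self-contained sketch of how the two paths might be joined into `net` before the rlQValueFunction call. The example specs, the `additionLayer`, and the layer names other than `netObsInput`/`netActInput` are assumptions for illustration, not the asker's actual setup:

```matlab
% Example observation/action specs (assumed dimensions for illustration)
obsInfo = rlNumericSpec([4 1]);
actInfo = rlNumericSpec([1 1]);

% Observation and action paths as above
obsPath = [featureInputLayer(prod(obsInfo.Dimension),Name="netObsInput")
    fullyConnectedLayer(16)
    reluLayer
    fullyConnectedLayer(5,Name="obsout")];
actPath = [featureInputLayer(prod(actInfo.Dimension),Name="netActInput")
    fullyConnectedLayer(16)
    reluLayer
    fullyConnectedLayer(5,Name="actout")];

% Common path: merge the two 5-element outputs and reduce to a scalar Q-value
comPath = [additionLayer(2,Name="add")
    fullyConnectedLayer(1,Name="QValue")];

% Assemble the graph and connect the paths by layer name
net = layerGraph(obsPath);
net = addLayers(net,actPath);
net = addLayers(net,comPath);
net = connectLayers(net,"obsout","add/in1");
net = connectLayers(net,"actout","add/in2");

% Recent releases expect a dlnetwork here
net = dlnetwork(net);

critic = rlQValueFunction(net,obsInfo,actInfo, ...
    ObservationInputNames="netObsInput", ...
    ActionInputNames="netActInput");
```

Because the network has two inputs, naming them explicitly is what lets rlQValueFunction tell the observation channel apart from the action channel.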

2 Comments

Apoorv Pandey on 27 Mar 2023
I have used the exact same code as in the link and am still getting the error. Please help.
Cris LaPierre on 27 Mar 2023
Please share your data and your code. You can attach files using the paperclip icon. If it's easier, save your workspace variables to a MAT file and attach that.


More Answers (0)

