Why can sim() not read a pretrained DQN agent?

1 view (last 30 days)
Kun Cheng on 23 Dec 2023
Answered: Sulaymon Eshkabilov on 23 Dec 2023
Hi,
I have trained an agent and want to simulate it with sim(), but I get the following error:
Error using rl.env.AbstractEnv/sim
Invalid argument at position 2. Value must be of type rl.policy.PolicyInterface or be convertible to rl.policy.PolicyInterface.
Error in Simulation/Validieren (line 44)
simResults = sim(self.env, self.agent);
Error in Simulation (line 22)
simResults = Validieren(self);
The following code is executed:
methods
    function self = Simulation()
        A = Punkten();
        self.PredictedData = cell(size(A.Divide_In_Volume, 1), ...
            size(A.Divide_In_Volume, 2));
        agent_test = load('agent_test.mat');
        self.agent = agent_test;
        li = 1; ui = size(A.Divide_In_Volume, 1);
        lj = 1; uj = size(A.Divide_In_Volume, 2);
        for i = li:ui
            for j = lj:uj
                self.env = PointsDistanceEnv(i, j, A);
                simResults = Validieren(self);
                self.PredictedData{i, j} = getPrediction(self, simResults);
                % TotalTree
                self.Total_Tree{i, j} = PlotDaten(self, ...
                    self.PredictedData{i, j}, A.Divide_In_Volume{i, j}, ...
                    B.R, C.TotalTree);
            end
        end
    end

    function simResults = Validieren(self)
        % data from simulation
        rng(2)
        sim_Env = self.env; sim_Agent = self.agent;
        simResults = sim(sim_Env, sim_Agent);
    end
end
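For reference, a minimal sketch (with hypothetical variable names, not taken from the code above) of how sim() from the Reinforcement Learning Toolbox is typically called with an environment and an agent object such as an rlDQNAgent:

% Minimal sketch with hypothetical names: sim() from the Reinforcement
% Learning Toolbox expects an environment object and an agent object
% (e.g. an rlDQNAgent), optionally together with rlSimulationOptions.
simOpts = rlSimulationOptions(MaxSteps=500);   % cap the episode length
experience = sim(env, agent, simOpts);         % env: rl environment, agent: rlDQNAgent
totalReward = sum(experience.Reward.Data);     % reward collected during the episode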
Information about training:
function Trainieren(self)
    % just some hyperparameters
    trainOpts = rlTrainingOptions(...
        MaxEpisodes=self.HP(9), ...
        MaxStepsPerEpisode=self.HP(10), ...
        Verbose=false, ...
        ScoreAveragingWindowLength=100, ...
        Plots="none", ...
        StopTrainingCriteria="EpisodeCount", ...
        StopTrainingValue=self.HP(9));
    % Plots="training-progress",...
    % logger
    fileLogger = rlDataLogger();
    fileLogger.AgentLearnFinishedFcn = @AgentLearnLoggingFcn;
    fileLogger.EpisodeFinishedFcn = @EpisodeLoggingFcn;
    % fileLogger.AgentStepFinishedFcn = @AgentStepFinishedFcn;
    fileLogger.LoggingOptions.FileNameRule = "LR<id>";
    trainingStats = train(self.agent, self.env, trainOpts, Logger=fileLogger);
    % filename = sprintf('agent_%d_%d.mat', i, j);
    % filename = ['agent_LR=' num2str(LearnRate) '.mat'];
    filename = 'agent_test.mat';
    agent_test = self.agent;
    save(filename, "agent_test")
    % filename = 'trainingStats_test.mat';
    % save(filename, "trainingStats")
end
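One detail worth noting (an assumption on my part, not something confirmed in the thread): save(filename, "agent_test") stores the variable agent_test inside the MAT-file, and load() called with an output argument returns a struct, so the agent object has to be taken out of that struct's field before it can be used again. A minimal sketch of reloading it:

% Sketch of reloading the saved agent (assumes the MAT-file written by
% Trainieren above). load() with an output returns a struct, and the
% agent object itself sits in the field named after the saved variable.
loaded = load('agent_test.mat');   % struct with field 'agent_test'
agent  = loaded.agent_test;        % the trained agent object
disp(class(agent))                 % should report a DQN agent class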
The sim() call in the function Validieren(self) cannot read this DQN agent, and I don't know why this happens.
The getPrediction() and PlotDaten() functions are irrelevant here.
Thanks a lot
Kun

Accepted Answer

Sulaymon Eshkabilov on 23 Dec 2023
Note that the sim() function is used to simulate Simulink models, not M-files or MLX-files - see the documentation. To call and execute M- or MLX-files, including function files, just call them from the command line or the M-file editor, or use the run() command.
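As a small illustration of the run() call mentioned above (script and variable names are hypothetical):

% run() executes a script file by name; a function or method is instead
% called directly with its arguments (hypothetical names below).
run('setup_simulation.m')         % runs the script in the current workspace
results = Validieren(simObject)   % a function/method is simply called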

More Answers (0)
