Warning: An error occurred while drawing the scene: Error in json_scenetree: Could not find node in replaceChild

88 views (last 30 days)
Hong-Ruei Ciou on 8 Jan 2021
Commented: Jacob on 6 Nov 2025 at 14:07
My MATLAB version is R2020a.
It always raises "Warning: An error occurred while drawing the scene: Error in json_scenetree: Could not find node in replaceChild".
I tried the previously suggested workaround, "opengl software", but it didn't help.
Is the problem in my code or in MATLAB, and how can I deal with it?
The full script I ran follows the sketch below.
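For reference, the "opengl software" workaround was applied roughly like this (a sketch; the opengl command is still available in R2020a, and the exact behaviour depends on the platform and graphics driver):

% Switch the current session to software OpenGL rendering (Windows);
% on Linux, start MATLAB with the -softwareopengl option instead.
opengl software
% ...or save the preference so future sessions also start with software OpenGL:
opengl('save','software')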
% Housekeeping
clear all
close all
clc

ATT  = 1;
DATE = datestr(now, 30);   % run timestamp, format 30 (yyyymmddTHHMMSS)

% Open the Simulink model and locate the RL Agent block
mdl = 'quarter_car';
open_system(mdl)
agentblk = [mdl '/RL Agent'];

% Observation and action specifications
numObs = 5;
numAct = 2;
observationInfo = rlNumericSpec([numObs 1], ...
    'LowerLimit',-inf*ones(numObs,1),'UpperLimit',inf*ones(numObs,1));
observationInfo.Name = 'observation';
actionInfo = rlFiniteSetSpec({[0;0],[0;1]});
actionInfo.Name = 'actor';

% Create the Simulink environment
env = rlSimulinkEnv(mdl,agentblk,observationInfo,actionInfo);

% Sample time and simulation length
Ts = 0.01;
Tf = 10;
rng(0)

% Critic network: observation path, action path, and common path
statePath = [
    imageInputLayer([numObs 1 1],'Normalization','none','Name','observation')
    fullyConnectedLayer(50,'Name','CriticObsFC1')
    reluLayer('Name','CriticRelu1')
    fullyConnectedLayer(30,'Name','CriticObsFC2')];
actionPath = [
    imageInputLayer([numAct 1 1],'Normalization','none','Name','action')
    fullyConnectedLayer(30,'Name','CriticActFC1')];
commonPath = [
    additionLayer(2,'Name','add')
    reluLayer('Name','CriticCommonRelu')
    fullyConnectedLayer(1,'Name','output')];

criticNetwork = layerGraph(statePath);
criticNetwork = addLayers(criticNetwork,actionPath);
criticNetwork = addLayers(criticNetwork,commonPath);
criticNetwork = connectLayers(criticNetwork,'CriticObsFC2','add/in1');
criticNetwork = connectLayers(criticNetwork,'CriticActFC1','add/in2');

% Critic representation
criticOpts = rlRepresentationOptions('LearnRate',0.001,'GradientThreshold',1);
obsInfo = getObservationInfo(env);
actInfo = getActionInfo(env);
critic = rlQValueRepresentation(criticNetwork,obsInfo,actInfo, ...
    'Observation',{'observation'},'Action',{'action'},criticOpts);

% DQN agent options
agentOptions = rlDQNAgentOptions(...
    'SampleTime',Ts,...
    'TargetSmoothFactor',1e-3,...
    'ExperienceBufferLength',1e6,...
    'UseDoubleDQN',false,...
    'DiscountFactor',0.5,...
    'MiniBatchSize',64);
% The original script set agentOpts.NoiseOptions.* here, which only created an
% unused struct; NoiseOptions is a DDPG-style option, while DQN agents use
% epsilon-greedy exploration (values carried over from the original lines):
agentOptions.EpsilonGreedyExploration.Epsilon = 0.5;
agentOptions.EpsilonGreedyExploration.EpsilonDecay = 1e-3;
agent = rlDQNAgent(critic,agentOptions);

% Training options
maxepisodes = 2000;
maxsteps = ceil(Tf/Ts);
trainingOptions = rlTrainingOptions(...
    'MaxEpisodes',maxepisodes,...
    'MaxStepsPerEpisode',maxsteps,...
    'ScoreAveragingWindowLength',5,...
    'Verbose',false,...
    'Plots','training-progress',...
    'StopTrainingCriteria','AverageReward',...
    'StopTrainingValue',25000);

% Train, then simulate the trained agent
trainingStats = train(agent,env,trainingOptions);
simOptions = rlSimulationOptions('MaxSteps',maxsteps);
experience = sim(env,agent,simOptions);

Answers (2)

Emmanouil Tzorakoleftherakis on 8 Jan 2021
My suspicion is that the error does not have much to do with the code you are showing but with how you create your environment. It seems you are using Simulink here - are you plotting anything in your Simulink model? Maybe trying to visualize what is happening with your system? That's what I would check first.
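If it helps to narrow this down, a minimal check along these lines (a sketch only; trainingOptionsNoPlot is an illustrative name, and any Scope or animation windows in the model should also be closed by hand) would show whether the figure/visualization side is responsible:

% Re-run training with no Episode Manager figure; print progress as text instead.
trainingOptionsNoPlot = rlTrainingOptions( ...
    'MaxEpisodes',2000, ...
    'MaxStepsPerEpisode',ceil(10/0.01), ...
    'Plots','none', ...      % no training-progress figure
    'Verbose',true);         % progress goes to the Command Window
trainingStats = train(agent,env,trainingOptionsNoPlot);
% If the warning disappears, the drawing error comes from the plotting or
% visualization, not from the RL setup itself.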

Chris on 28 Sep 2025
I have the same problem, but I am not using Simulink. I have just downloaded R2025b and I get this error, usually after resizing a figure. It does not occur in older versions.
1 Comment
Jacob on 6 Nov 2025 at 14:07
I had the same error appear when trying to plot data that included NaNs. I had to restart MATLAB and remove the NaNs from my data; then plot() worked.
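For anyone hitting the same thing, stripping the NaNs before plotting can be as simple as this (a sketch with made-up vectors x and y):

% Keep only samples where both coordinates are not NaN
keep = ~isnan(x) & ~isnan(y);
plot(x(keep), y(keep))
% For tables or timetables, rmmissing does the same job:
% T = rmmissing(T);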
