Error in LSTM code
Hello, the following code throws an error:

clc
clear
close all
A1 = 0.04;
A4 = 0.04;
B1 = 1.5e-4;
B4 = 1.5e-4;
Kp1 = 3.7E-6;
Kp4 = 3.7E-6;
g = 9.8;
v=[12.6101 1.0688 31.0125 37.5489 12.9138 71.4793 95.7939 32.9065 79.1094 67.4771 11.8518 72.6723 88.3776 1.0488 11.0924 37.6680 91.5542 99.8999 17.8789 18.0891 11.2040 61.0086 41.7771 26.3225 20.9668 19.5812 88.4935 63.5100 62.6202 66.1686 18.5314 62.9590 87.7691 18.6421 64.9118 88.5281 25.8011 59.0359 49.2004 84.0695 6.1159 74.2078	55.8971	43.5178	78.1925	40.2701	36.3657	66.2878	40.0230	18.6693	6.3752	25.4650	90.2795	98.2619	48.9268	15.3302	51.6550	99.3592	76.4943	96.6364	93.0582	24.4126	10.6547	41.5677	13.9861	75.7021	35.4622	45.5286	20.7342	73.9153	21.1334	77.0479	24.8867	61.7596	35.2314	72.5405	12.6051	18.8981	20.1926	36.6943	41.0350	23.3522	67.2367	95.7641	43.7348	80.5776	80.0207	99.5289	21.9686	33.5729	76.4718	75.9192	96.1664	57.4193	61.2421	37.1441	35.0112	60.3988	53.0373	49.0117];
p=[65.9094 4.7869 19.8024 92.9458 58.7150 2.5256 30.0739 5.4808	79.6784	71.4252	54.0998	59.0927	21.8838	41.9776	52.0662	83.9952	66.3312	48.6364	85.4237	48.6713	27.9573	73.3884	96.9775	38.1866	24.6214	25.0278	2.8750	98.2827	80.4593	56.7850	39.4941	65.6003	99.3771	71.6628	11.1343	45.2816	69.5867	33.9626	92.2531	63.6166	93.2864	43.1661	30.7485	89.0138	2.7200	15.6575	95.9571	71.7325	31.3402	82.9815	80.9813	91.0163	64.6385	63.2399	12.7286	91.9783	62.7741	26.5009	95.1791	5.5744	3.0718	9.7170	52.2664	87.9877	40.8157	56.5710	60.2174	8.4300	39.7575	36.1435	25.2341	11.6948	12.1166	27.4533	27.3728	93.6720	19.4315	51.2361	15.6118	92.1493	93.0167	14.5384	87.2854	2.2232	72.4816	87.9799	39.3572	25.3966	12.0606	84.8313	66.3062	78.1866	36.4977	19.2294	2.0078	9.5056	33.6173	30.7822	51.1505	39.2739];
sim_in = [v, p];
sim_out = [7.1311 6.3588 6.0545	7.8257	9.1521 8.0623	7.0667	6.2499	6.0917	6.2498	7.5273	7.2557	6.3734	7.3573	8.4913	9.8310	8.9261	7.8193	10.0257	10.6941	10.6282	10.7414	12.1362	12.1442	11.7505	11.4088	10.1702	10.6450	10.8248	10.4919	10.7776	10.6827	10.0315	11.5522	10.4712	9.5101	10.7533	10.1794	11.1625	10.3909	13.2449	12.4087	11.7282	12.7822	11.4824	10.6570	12.2692	12.0873	11.6660	13.5195	15.6611	17.3295	16.0747	14.6372	13.5176	15.7546	15.6847	14.2223	13.8570	12.4838	11.1820	10.2803	11.2498	12.3825	12.6999	12.0127	12.5235	11.4221	11.6250	10.7953	10.4960	9.4175	8.6992	8.0898	7.8539	8.0031	7.7384	8.6275	8.1134	9.7540	11.1313	10.4081	10.5331	9.3326	10.0847	9.6961	8.9038	7.8016	7.2000	8.8101	8.4298	8.2227	7.2229	6.6039	5.6859	5.0770	5.2510	4.9641	5.2480	5.3237];
v_Datastore = arrayDatastore(v);
p_Datastore = arrayDatastore(p);
out_Datastore=arrayDatastore(sim_out);
trainDatastore = combine(v_Datastore, p_Datastore, out_Datastore);
% --------------- CREATE AND TRAIN NETWORK ------------------
% Network architecture
numResponses = 1;
featureDimension = 1;
numHiddenUnits = 400;
maxEpochs = 400;
miniBatchSize = 300;
Networklayers = [sequenceInputLayer(featureDimension) ...
    lstmLayer(numHiddenUnits) ...
    dropoutLayer(0.02),...
    fullyConnectedLayer(numResponses) ...
    regressionLayer
    ];
options = trainingOptions('adam', ...
    'MaxEpochs',maxEpochs, ...
    'MiniBatchSize',miniBatchSize, ...
    'GradientThreshold',20, ...
    'Shuffle','once', ...
    'Plots','training-progress',...
    'LearnRateSchedule','piecewise',...
    'LearnRateDropPeriod',200,...
    'L2Regularization',1e-3,...
    'LearnRateDropFactor',0.5,...
    'Verbose',0,...
    'ValidationData',[{sim_in} {sim_out}]);
% ENTRENAMIENTO
net = trainnet(trainDatastore, Networklayers,options);
It complains that an argument is missing from the function call, but I don't know which one to provide. Thank you.
If I use trainNetwork instead, the error is:
Error using trainNetwork (line 191)
Invalid validation data. Sequence responses must have the same sequence length as the
corresponding predictors.
Error in miso_directo (line 52)
net = trainNetwork(trainDatastore,Networklayers,options);
I have two inputs of 100 elements each and one output of 100 elements. Thanks.
Answers (1)
Shivani on 3 June 2024 (edited 3 June 2024)
After analyzing the shared code, it appears the error from trainnet occurs because the loss function argument is missing; trainnet requires a loss function to be specified.
Note also that the regressionLayer added to the network is no longer recommended by MathWorks, as stated in the documentation: https://www.mathworks.com/help/deeplearning/ref/regressionlayer.html
Instead of using a regressionLayer, you can drop it from the layer array and specify "mse" as the loss function when calling trainnet. Below is an example showing how Networklayers should be modified.
Networklayers = [sequenceInputLayer(featureDimension) ...
    lstmLayer(numHiddenUnits) ...
    dropoutLayer(0.02),...
    fullyConnectedLayer(numResponses) ...
    ];
Then pass the loss function to trainnet, as shown in the following code snippet.
net = trainnet(trainDatastore, Networklayers,"mse",options);
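With the regressionLayer removed, the "mse" argument tells trainnet to apply a mean-squared-error loss between the output of the fullyConnectedLayer and the target values.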
"Invalid validation data. Sequence responses must have the same sequence length as the corresponding predictors."
You are likely receiving the above error from trainNetwork because the lengths of sim_in and sim_out differ. Since sim_in = [v, p] concatenates v and p horizontally, sim_in is a 1x200 vector, while the corresponding output sim_out is a 1x100 vector. This sequence-length mismatch is what triggers the validation error in trainNetwork.
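If v and p are instead meant to be two features sampled over the same 100 time steps, one way to make the predictor and response sequence lengths agree is to stack them as rows rather than concatenating them end to end. The snippet below is a minimal sketch under that assumption (the row stacking and the featureDimension change are suggestions, not taken from the original post).
% Assumption: v and p are two features observed over the same 100 time steps
sim_in = [v; p];          % 2-by-100 predictor sequence: 2 features, 100 time steps
% sim_out is already 1-by-100, so predictor and response sequence lengths match
featureDimension = 2;     % sequenceInputLayer(featureDimension) must then expect 2 features
% 'ValidationData',{sim_in, sim_out} then supplies sequences of equal length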
Hope this helps!