C-LSTM with 4-D input to predict 12x1 values

TROY TULLY on 17 March 2021
Answered: Srivardhan Gadila on 28 March 2021
I am trying to use a C-LSTM to predict 12 values for each time step with regression.
I'm happy to provide as much information as I can; as far as I can tell, I'm providing the correct inputs for the C-LSTM architecture...
Please help! This is important medical research!
Thank you all.
I tried it with two other formats. One sort of worked (RMSE = 0.54). I then tried wrapping each variable in a cell array, so that the training and validation predictors are each a single cell array holding a 528-by-n time series and the kinematics are 12-by-n, and somehow that worked (a sketch of this format follows the layer parameters below).
layers = [ ...
sequenceInputLayer(inputSize,'Name','input')
sequenceFoldingLayer('Name','fold')
convolution2dLayer(filterSize,numFilters,'Name','conv')
batchNormalizationLayer('Name','bn')
reluLayer('Name','relu')
sequenceUnfoldingLayer('Name','unfold')
flattenLayer('Name','flatten')
lstmLayer(numHiddenUnits,'OutputMode','sequence','Name','lstm')
reluLayer('Name','relu2')
fullyConnectedLayer(numFeatures*2, 'Name','fc1')
reluLayer('Name','relu3')
fullyConnectedLayer(numFeatures*2, 'Name','fc2')
reluLayer('Name','relu4')
fullyConnectedLayer(numFeatures*2, 'Name','fc3')
fullyConnectedLayer(numClasses, 'Name','fcl')
regressionLayer('Name','regression')];
lgraph = layerGraph(layers);
lgraph = connectLayers(lgraph,'fold/miniBatchSize','unfold/miniBatchSize');
Here are my layer parameters:
inputSize is [528 1 1]
filterSize is [64 1]
numFilters is 50
numClasses is 12
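For illustration, here is a minimal sketch of the cell-array format described above, assuming sequence-to-sequence regression with these layer parameters. The sequence length n and all values are made up, and the 528-by-n series is reshaped to 528-by-1-by-1-by-n to match the image-sequence input layer:
numSteps = 1000;                        % n, the number of time steps (assumed)
% Predictors: one observation, a 528-by-1-by-1-by-n image sequence
XTrain = {rand([528 1 1 numSteps])};
% Responses: a 12-by-n matrix, i.e. one 12x1 kinematics vector per time step
YTrain = {rand(12, numSteps)};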

Answers (1)

Srivardhan Gadila on 28 March 2021
Refer to the documentation of the sequences and responses input arguments of the trainNetwork function, for the syntax
net = trainNetwork(sequences,responses,layers,options), to understand the required format of the training data.
You can execute the following code to understand the data format:
inputSize = [528 1 1];
filterSize = [64 1];
numFilters = 50;
numClasses = 12;
numHiddenUnits = 200;
numResponses = numClasses;
numFeatures = 10;
layers = [ ...
sequenceInputLayer(inputSize,'Name','input')
sequenceFoldingLayer('Name','fold')
convolution2dLayer(filterSize,numFilters,'Name','conv')
batchNormalizationLayer('Name','bn')
reluLayer('Name','relu')
sequenceUnfoldingLayer('Name','unfold')
flattenLayer('Name','flatten')
lstmLayer(numHiddenUnits,'OutputMode','sequence','Name','lstm')
reluLayer('Name','relu2')
fullyConnectedLayer(numFeatures*2, 'Name','fc1')
reluLayer('Name','relu3')
fullyConnectedLayer(numFeatures*2, 'Name','fc2')
reluLayer('Name','relu4')
fullyConnectedLayer(numFeatures*2, 'Name','fc3')
fullyConnectedLayer(numClasses, 'Name','fcl')
regressionLayer('Name','regression')];
lgraph = layerGraph(layers);
lgraph = connectLayers(lgraph,'fold/miniBatchSize','unfold/miniBatchSize');
analyzeNetwork(lgraph)
%%
numTrainSamples = 50;
% Predictors: 50-by-1 cell array, each cell a 528-by-1-by-1 image with sequence length 1
trainData = arrayfun(@(x)rand([inputSize(:)' 1]),1:numTrainSamples,'UniformOutput',false)';
% Responses: 50-by-1 cell array, each cell a 12-by-1 response vector
trainLabels = arrayfun(@(x)rand(numResponses,1),1:numTrainSamples,'UniformOutput',false)';
size(trainData)
size(trainLabels)
%%
options = trainingOptions('adam', ...
'InitialLearnRate',0.005, ...
'LearnRateSchedule','piecewise',...
'Verbose',1, ...
'Plots','training-progress');
net = trainNetwork(trainData,trainLabels,lgraph,options);
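Once trained, predictions on new data in the same cell-array format could be obtained along these lines (the test data here is synthetic and purely illustrative):
numTestSamples = 5;
testData = arrayfun(@(x)rand([inputSize(:)' 1]),1:numTestSamples,'UniformOutput',false)';
YPred = predict(net,testData);   % predicted 12-value responses for each sample
size(YPred)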
