Neural Network training - number of observations in X and Y disagree

2 views (last 30 days)
Tajwar Choudhury
Tajwar Choudhury on 28 Mar 2020
Commented: Tajwar Choudhury on 31 Mar 2020
I have created a database of sine waves with random periods and amplitudes, and added random noise to each one, giving a set of "clean" data and "noisy" data. I hope to train a simple feedforward net to denoise the noisy signals. My training data (the noisy signals) is a 301x10000 double array, where each column corresponds to a single wave: 301 is the length in time and 10000 is the number of random signals. The clean signals are in exactly the same format, a 301x10000 double array, and correspond index-for-index, i.e. the first noisy signal in the training set is a noisy version of the first clean signal.
My network structure is simple: an image input layer, 3 fully connected layers with tanh activations, and a regression output. I know I need to reshape the data using the reshape function, but I'm unsure how. What I don't get is why it says the number of observations in the training data and target data disagree when they're both the same size.
Essentially:
trainingData is a 301x10000 double array of noisy sine waves.
trainingTargets is a 301x10000 double array of the same sine waves but without the noise.
The image input layer has dimensions of [1 301].
When feeding these into the net with the trainNetwork function, I get "number of observations in X and Y disagree".
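For reference, a minimal sketch that should reproduce the problem (shapes and layers assumed from the description above; the input size is written as [1 301 1] to make the channel dimension explicit):

```matlab
% Minimal reproduction sketch: both arrays are plain 2-D 301x10000
% doubles passed straight to trainNetwork, with no 4th (observation)
% dimension for trainNetwork to split them on.
trainingData    = randn(301, 10000);   % noisy waves, one per column
trainingTargets = randn(301, 10000);   % clean waves, same layout
layers = [imageInputLayer([1 301 1])
          fullyConnectedLayer(500)
          fullyConnectedLayer(500)
          fullyConnectedLayer(301)
          regressionLayer];
options = trainingOptions('adam');
% Errors with: "The number of observations in X and Y disagree."
net = trainNetwork(trainingData, trainingTargets, layers, options);
```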
  1 Comment
Adam Danz
Adam Danz on 28 Mar 2020
Could you put together a minimal working example that reproduces the problem?


Answers (1)

Srivardhan Gadila
Srivardhan Gadila on 31 Mar 2020
Refer to Train Convolutional Neural Network for Regression and check the sizes of XTrain & YTrain to reshape your data accordingly.
The following code might help you:
layers = [
    imageInputLayer([301 1 1])    % each observation is a 301x1x1 "image"
    fullyConnectedLayer(500)
    fullyConnectedLayer(301)
    regressionLayer];
trainData = randn([301 1 1 1000]);    % X: 4-D h-by-w-by-c-by-N, N = 1000 observations
trainLabels = randn([1000 301]);      % Y: N-by-numResponses
options = trainingOptions('adam', ...
    'InitialLearnRate',0.005, ...
    'LearnRateSchedule','piecewise', ...
    'MaxEpochs',300, ...
    'MiniBatchSize',1024, ...
    'Verbose',1, ...
    'Plots','training-progress');
net = trainNetwork(trainData,trainLabels,layers,options);
  1 Comment
Tajwar Choudhury
Tajwar Choudhury on 31 Mar 2020
Thank you for the response. What I ended up doing, which worked, is the following:
% Targets as 1x1x301xN (matches the 1x1x301 output of the last FC layer)
trainingTargets2 = reshape(trainingTargets, [1 1 size(trainingTargets,1) size(trainingTargets,2)]);
% Data as 301x1x1xN (matches the [301 1 1] image input layer)
trainingData2 = reshape(trainingData, [size(trainingData,1) 1 1 size(trainingData,2)]);
validationTargets2 = reshape(validationTargets, [1 1 size(validationTargets,1) size(validationTargets,2)]);
validationData2 = reshape(validationData, [size(validationData,1) 1 1 size(validationData,2)]);
I used those as X, Y and the X, Y validation data, where trainingTargets are the generated clean waves and trainingData are the same waves with added noise (and likewise for the validation sets).
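Putting those reshapes together, the full training call would look roughly like this (a sketch with random placeholder data; layer sizes assumed from the answer above):

```matlab
% Placeholder data standing in for the generated sine waves.
trainingData      = randn(301, 10000);
trainingTargets   = randn(301, 10000);
validationData    = randn(301, 1000);
validationTargets = randn(301, 1000);

% X as 301x1x1xN, Y as 1x1x301xN, observations along dimension 4.
trainingData2      = reshape(trainingData,      [size(trainingData,1) 1 1 size(trainingData,2)]);
trainingTargets2   = reshape(trainingTargets,   [1 1 size(trainingTargets,1) size(trainingTargets,2)]);
validationData2    = reshape(validationData,    [size(validationData,1) 1 1 size(validationData,2)]);
validationTargets2 = reshape(validationTargets, [1 1 size(validationTargets,1) size(validationTargets,2)]);

layers = [imageInputLayer([301 1 1])
          fullyConnectedLayer(500)
          fullyConnectedLayer(301)
          regressionLayer];
options = trainingOptions('adam', ...
    'ValidationData', {validationData2, validationTargets2});
net = trainNetwork(trainingData2, trainingTargets2, layers, options);
```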


Release: R2019b
