Error using +. Matrix dimensions must agree

Hi, I'm trying to create a neural network to train on my data, but I get an error. I thought the code was right, so which part should I fix? Here is my code:
%clc;
close all;
clearvars;
% Multilayer Perceptron (MLP) neural network.
N = 220; % number of samples
D = 25;  % input dimension
K = 11;  % number of target classes
h = 500; % number of hidden neurons
load datatrain1.mat
Train1 = cell2mat(Train);
load classtrain1.mat
CTrain1 = cell2mat(TTrain);
b1 = randn(1,h); % bias, input to hidden
W1 = randn(D,h); % weights, input to hidden
b2 = randn(1,K); % bias, hidden to output
W2 = randn(h,K); % weights, hidden to output
% feedforward
for epoch = 1 : 1000
    H = Train1*W1 + b1;
    Y = H*W2 + b2;
end
I get an error on the line computing H: Error using +. Matrix dimensions must agree.
Any help fixing my code would be much appreciated. Thanks.

2 Comments

madhan ravi on 11 Nov 2018
upload .mat files
Oman Wisni on 11 Nov 2018
Yes sir, I have already attached my .mat files. But I'm not sure they will open in MATLAB: when I uploaded and downloaded them before, the downloaded file format was different, and when I uploaded I got no reply.


Accepted Answer

Guillaume on 11 Nov 2018
Edited: Guillaume on 11 Nov 2018

1 vote

The error message is very clear: you're trying to add two matrices of different sizes, which is never possible. Since b1 is a 1×h vector, Train1*W1 must also be 1×h, and since W1 is a D×h matrix, Train1 must be a 1×D vector for the product to be 1×h.
Therefore, your Train1 is not 1×D. There is nothing we can do about that. Either use a Train1 matrix that is 1×D or change your D to reflect the actual size of Train1.
Note that it is never a good idea to hardcode the size of matrices. It is always safer to ask MATLAB for the actual size:
D = size(Train1, 2);
This ensures D always matches Train1. Of course, Train1 must then have only one row.
Edit: I got confused between D and h, but see the comment below.
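Following the advice above, the sizes can be derived from the loaded data instead of being hardcoded (a sketch, assuming Train is the cell array from the poster's datatrain1.mat):

```matlab
load datatrain1.mat          % provides Train (a cell array)
Train1 = cell2mat(Train);
N = size(Train1, 1);         % number of samples, derived from the data
D = size(Train1, 2);         % input dimension, derived from the data
W1 = randn(D, h);            % weight sizes now follow the data automatically
```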

10 Comments

Oman Wisni on 11 Nov 2018
Edited: Oman Wisni on 11 Nov 2018
Yes sir, but I cannot change h or Train1: the values come from feature extraction, and h is the number of neurons (500) in the single hidden layer; b is the bias. That is the formula I have from theory.
I am confused too. Did I declare the bias and weights wrongly? But I got the example from this forum.
Stephen23 on 11 Nov 2018
"But the example I get from this forum"
Please give a link to the thread where you got this from.
Oman Wisni on 11 Nov 2018
Edited: madhan ravi on 11 Nov 2018
Guillaume on 11 Nov 2018
Actually, my reasoning above was a bit wrong, at least for versions ≥ R2016b. Since the implicit expansion introduced in R2016b, the sum will work as long as the product Train1*W1 has h columns; the number of rows doesn't matter. Since the product is guaranteed to have h columns (or fail), the only reason the sum would fail for you is that you're using a version earlier than R2016b.
In versions prior to R2016b you had to use bsxfun for explicit expansion, so:
H = bsxfun(@sum, Train1*W1, b1);
Most likely, you'll have to do the same for the next line.
Oman Wisni on 11 Nov 2018
Yes, I am using R2015a.
What does bsxfun mean? And if I run it, will the result be the same?
Guillaume on 11 Nov 2018
See the documentation of bsxfun. In this particular case, it means the input b1 will be replicated to match the height of Train1*W1. Yes, the result will be the same as you would have got in later versions with implicit expansion.
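A small sketch of the replication Guillaume describes, using stand-in sizes rather than the poster's actual data:

```matlab
A = randn(220, 500);               % stand-in for Train1*W1 (N-by-h)
b = randn(1, 500);                 % stand-in for b1 (1-by-h)
H1 = bsxfun(@plus, A, b);          % b is virtually replicated along the rows
H2 = A + repmat(b, size(A,1), 1);  % the same thing with explicit replication
isequal(H1, H2)                    % true: both forms add b to every row of A
```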
Oman Wisni on 11 Nov 2018
Okay sir, I will try it. Thank you.
Oman Wisni on 12 Nov 2018
Sir, which code is right?
H = bsxfun(@sum, Train1*W1, b1);
or
H = bsxfun(@plus, Train1*W1, b1);
Using @sum I get an error, but with @plus I get a result. My question is: if I use @plus, will the result stay the same? Thanks.
Guillaume on 12 Nov 2018
Sorry, it should indeed have been @plus. bsxfun needs a binary element-wise function, and sum (which reduces a single array) is not one, hence the error with @sum.
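Putting the correction together, the feedforward loop from the question would, on pre-R2016b versions, look something like this (a sketch based on the poster's variables; on R2016b and later the plain + form works directly):

```matlab
for epoch = 1:1000
    H = bsxfun(@plus, Train1*W1, b1);  % N-by-h hidden layer pre-activations
    Y = bsxfun(@plus, H*W2, b2);       % N-by-K output layer pre-activations
end
```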
Oman Wisni on 12 Nov 2018
Yes sir. Thanks.


More Answers (0)
