MATLAB Answers

Out of memory issue while training a Neural Network (NN), array exceeds maximum array size preference using backpropJacobianStatic

5 views (last 30 days)
Timothee Fichot on 16 Jul 2020
Edited: Timothee Fichot on 21 Jul 2020
Hello, this is my first time asking a question here; I will try to be brief and clear!
I am currently trying to train a NN with 2 hidden layers of 256 neurons each; the input and output are a 22×size(trainSet) data set. This amounts to 77334 weights + biases, which shouldn't be a problem for training, since I have seen posts where people train much larger NNs. The issue is that when I call the train function, somewhere inside the MATLAB code (in backpropJacobianStatic) there is a matrix multiplication that creates an array of size 77334×77334 (77334 being the number of weights), and that takes all the memory, causing an out-of-memory error (screenshot of the issue below):
My question is the following: is there a way to avoid creating this numberOfWeights×numberOfWeights matrix that takes all the memory during training? I don't really understand why we would need to store this array, since we only need a 1×77334 array to store the weights, no?
Thank you in advance for your answers, and if you have any questions or if I wasn't clear, feel free to ask me for more information!
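As a sanity check on the numbers in the question (a sketch; assumes double precision and the 22-256-256-22 fully connected architecture described above):

```matlab
% Parameter count for a 22-256-256-22 network (weights + biases per layer)
nw = (22*256 + 256) + (256*256 + 256) + (256*22 + 22)   % = 77334
% A 77334-by-77334 array of doubles (8 bytes/element), like the one built
% inside backpropJacobianStatic, would need roughly:
bytesNeeded = nw^2 * 8    % ~4.78e10 bytes, i.e. about 48 GB
```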


Accepted Answer

Timothee Fichot on 21 Jul 2020
Edited: Timothee Fichot on 21 Jul 2020
If someone else has the same issue, here is how I solved it: just don't use the default training method, which is Levenberg-Marquardt backpropagation. That method requires computing the Jacobian matrix, which in my case was too large. Instead, you can use e.g. trainscg (or any other solver that does not require the Jacobian), which only needs the gradient, which is much smaller!
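A minimal sketch of this fix (assuming the Deep Learning Toolbox fitnet interface; `inputs` and `targets` stand for the 22-row training matrices from the question):

```matlab
net = fitnet([256 256]);      % two hidden layers of 256 neurons each
net.trainFcn = 'trainscg';    % scaled conjugate gradient: needs only the
                              % gradient, not the numWeights-by-numWeights
                              % Jacobian product that trainlm builds
[net, tr] = train(net, inputs, targets);
```

Other gradient-only training functions such as 'trainrp' or 'traingdx' avoid the Jacobian as well.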


More Answers

Greg Heath on 17 Jul 2020
A single hidden layer is sufficient.
Hope this helps
Thank you for formally accepting my answer
Greg
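Greg's suggestion also sidesteps the memory problem: with one hidden layer the parameter count drops enough that even the Jacobian-based default solver fits in memory. A sketch (assuming fitnet and the 22-input/22-output data from the question):

```matlab
% One hidden layer of 256 neurons: (22*256+256) + (256*22+22) = 11542
% parameters, so even an 11542-by-11542 double array needs only ~1 GB.
net = fitnet(256);
[net, tr] = train(net, inputs, targets);   % inputs/targets: 22-by-N each
```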

1 Comment

Timothee Fichot on 17 Jul 2020
Hello Greg,
Indeed, using only one layer would solve my issue, but I'm looking for a way to use 2 layers.
Thank you for your answer though !


