Why doesn't my CNN fit into the memory of the GPU?
Hi,
I tried to inflate VGG16 to 3-D using this function: https://se.mathworks.com/matlabcentral/fileexchange/87594-3d-convolutional-neural-network?s_tid=srchtitle . It worked fine (at least analyzeNetwork() didn't report any errors), but when I try to train the network as a batch job on the GPU, it displays:
Error using parallel.Job/fetchOutputs (line 1264)
An error occurred during execution of Task with ID 1.
Caused by:
    Error using trainNetwork (line 183)
    Maximum variable size allowed on the device is exceeded.
        Error using nnet.internal.cnngpu.convolveForwardND
        Maximum variable size allowed on the device is exceeded.
However, the GPU used is a Tesla V100 with 32 GB of RAM. And if I use the command whos lgraph to check the size of the network, it only gives:
whos lgraph
  Name        Size                Bytes  Class                  Attributes
  lgraph      1x1             176553068  nnet.cnn.LayerGraph              
But how come the 2-D VGG16 is said to be around 500 MB, while my 3-D VGG16 sits at only about 180 MB? How do I calculate the actual memory size of the network and what is needed on the GPU? My images (volumes) are of this size:
  whos v
  Name        Size                    Bytes  Class     Attributes
  v         224x224x224            44957696  single   
And my mini-batch size is 10.
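[Editor's note: one rough way to answer the "how do I calculate" part for the weights is sketched below. It assumes the inflated layer graph is in lgraph, and it only counts the parameters; training additionally needs memory for activations, gradients, and optimizer state, which dominate here (see the accepted answer).]
  % Rough parameter count for a layer graph (a sketch, not an official API).
  % Sums every numeric layer property, which is dominated by the Weights
  % and Bias arrays; small properties like Stride add negligible counts.
  totalParams = 0;
  layers = lgraph.Layers;
  for i = 1:numel(layers)
      props = properties(layers(i));
      for j = 1:numel(props)
          value = layers(i).(props{j});
          if isnumeric(value)
              totalParams = totalParams + numel(value);
          end
      end
  end
  fprintf('~%.1f M parameters, ~%.2f GB as single\n', ...
      totalParams/1e6, totalParams*4/2^30);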
Accepted Answer
Joss Knight on 14 Mar 2021, edited 14 Mar 2021
VGG16 is a 1GB model; if you inflate it to 3-D you're going to have very serious memory pressure. More to the point, the error you are getting says that the number of elements in some array is greater than the maximum allowed by gpuArray, which is 2147483647. The output of the first convolutional layer of VGG16 for a batch size of 10 is 224x224x64x10. If you trivially extend that to 3-D then the output is 224x224x224x64x10, which is 7193231360 elements and so more than 3x bigger than the largest allowed variable size. Reducing the batch size to 2 would bring that activation just under the limit (at batch size 3 it is 2157969408 elements, still slightly over) - but that single activation would still be around 5.4 GB in single precision, and you would run out of memory pretty quickly.
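[Editor's note: the arithmetic above is easy to check with a minimal sketch. The 224/64 sizes come straight from this answer, and the 2147483647 limit is intmax('int32'), the maximum element count of a gpuArray.]
  maxElements = double(intmax('int32'));   % 2147483647, the gpuArray element limit
  for miniBatchSize = [10 3 2]
      % Output of the first inflated conv layer: 224x224x224 volume, 64 filters
      numElements = 224*224*224*64*miniBatchSize;
      gigabytes   = numElements*4/2^30;    % single precision, 4 bytes per element
      fprintf('batch %2d: %d elements (%.1f GB), fits: %d\n', ...
          miniBatchSize, numElements, gigabytes, numElements <= maxElements);
  end
This prints 7193231360 elements (26.8 GB) for batch size 10, 2157969408 (8.0 GB, still over the limit) for batch size 3, and 1438646272 (5.4 GB, under the limit) for batch size 2.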
2 Comments
Joss Knight on 15 Mar 2021
Sorry, this isn't my area of expertise. We have an example of 3-D semantic segmentation in our documentation that uses a U-Net architecture. Otherwise, you may have to read around in the Deep Learning community, or ask that specific question on MATLAB Answers.