Why does AlexNet train slower and use much more memory with "simpler" brain CT images?
MATLAB R2017a, Windows, GPU
I have used transfer learning with AlexNet, retraining it to classify 3 types of CT brain abnormalities by replacing the last fully connected layer and calling the trainNetwork function. As image input I converted the CT brain scans to uint8 TIFF images (pixel values in the 0-255 range). As a result, the surrounding air and scalp fat were set to 0, which corresponds to water density on CT images. Bone became 255, while the brain tissues, both normal and abnormal, remained in the 1-85 range. This worked well, with minibatch accuracies of ~0.98 and test-batch accuracies of ~0.95.
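For reference, here is a minimal sketch of the kind of transfer-learning setup described above. The folder name, train/test split, learn-rate factors, and training options are assumptions for illustration (the actual script is not shown), and the images are assumed to already be 227-by-227-by-3 to match AlexNet's input layer.

% Load pretrained AlexNet (requires the AlexNet support package)
net = alexnet;
layersTransfer = net.Layers(1:end-3);   % drop the final fully connected, softmax, and output layers

numClasses = 3;                         % three CT abnormality classes
layers = [
    layersTransfer
    fullyConnectedLayer(numClasses,'WeightLearnRateFactor',20,'BiasLearnRateFactor',20)
    softmaxLayer
    classificationLayer];

% Datastore over the preprocessed uint8 TIFFs (hypothetical folder layout)
imds = imageDatastore('ctTiffs','IncludeSubfolders',true,'LabelSource','foldernames');
[imdsTrain,imdsTest] = splitEachLabel(imds,0.8,'randomized');

options = trainingOptions('sgdm', ...
    'MiniBatchSize',64, ...
    'InitialLearnRate',1e-4, ...
    'MaxEpochs',10);

netTransfer = trainNetwork(imdsTrain,layers,options);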
I then thought I could improve the network by removing the bone/skull during image preprocessing, so I wrote a function that sets the bone/skull to 0 (instead of 255), a sketch of which is shown below. I assumed this would let the brain tissues and abnormalities span a greater range in the normalized images. Now when I attempt to retrain AlexNet, each iteration takes about 5 times longer, system memory usage maxes out (63 GB is available), and MATLAB freezes. Any idea why?
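For context, the skull-removal step might look something like the following; the 255 threshold reflects how bone was mapped earlier, and the commented-out rescale is an illustrative assumption, not part of the poster's description. Such a function could be applied when writing the TIFFs, or attached to the datastore via its ReadFcn property.

function Iout = removeSkull(Iin)
    % Iin: uint8 CT slice where air/fat are 0 and bone was mapped to 255
    Iout = Iin;
    Iout(Iout == 255) = 0;   % map bone/skull down to 0 as well
    % Optional: stretch the remaining brain range (~1-85) over the full uint8 range
    % Iout = uint8(255 * min(double(Iout), 85) / 85);
end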
2 Comments
Joss Knight on 28 Apr 2018
The failure to converge as fast is not a particular surprise - perhaps there simply isn't sufficient information in your dataset to do the classification now (i.e. the bone/water distinction was essential). As to why you max out system memory, that sounds like a memory leak. Are you able to upgrade MATLAB to a more recent version?
Answers (0)