
Transfer learning without imresize or imageDatastore

Michael Benton on 12 Aug 2018
Edited: Michael Benton on 4 Sep 2018
For a large number of small images, it would be nice to load the data into RAM and perform transfer learning without reading images from slower storage through an imageDatastore. Unfortunately, if I have 50x50x1 images and have to resize them to fit alexnet's 227x227x3 input, I can only keep about (50*50*1)/(227*227*3) ≈ 1.6% as many samples in RAM.
Does anyone know a fix? A custom layer that resizes would work, but that's a lot of work.
Using R2017b.
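
A quick sanity check of that ratio (assuming single-precision pixels purely for illustration; the exact byte count depends on the data type):

% Memory per image before and after resizing to alexnet's 227x227x3 input.
bytesSmall = 50*50*1*4;        % original 50x50 grayscale image, 4 bytes per pixel
bytesLarge = 227*227*3*4;      % resized RGB image expected by alexnet
ratio = bytesSmall/bytesLarge  % ~0.016, i.e. only ~1.6% as many samples fit in RAM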

Answers (1)

Shounak Mitra on 20 Aug 2018
Hi Michael,
Thanks for your question.
If you need to resize, wrap your image datastore in an augmentedImageDatastore. It will be much faster than a custom ReadFcn because it preserves prefetching under the hood. If you don't need to resize, augment, or do anything else that would require a custom ReadFcn, then a vanilla imageDatastore is simplest.
Another option is to create a custom layer, but as you say, that will take some work.
HTH Shounak
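
A minimal sketch of this suggestion, wrapping an imageDatastore in an augmentedImageDatastore so the 50x50 grayscale images are resized and converted to RGB on the fly (the folder path, class count, and training options are placeholders, not taken from the question; augmentedImageDatastore ships with R2018a and later):

% Transfer learning with alexnet, resizing on the fly instead of using a custom ReadFcn.
imds = imageDatastore('pathToImageFolder', ...       % placeholder folder
    'IncludeSubfolders', true, 'LabelSource', 'foldernames');

net = alexnet;                                % requires the AlexNet support package
inputSize = net.Layers(1).InputSize;          % [227 227 3]
numClasses = numel(categories(imds.Labels));

% Swap the final layers for the new classification task.
layers = [
    net.Layers(1:end-3)
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];

% Resize to 227x227 and replicate the gray channel to RGB during training.
auimds = augmentedImageDatastore(inputSize(1:2), imds, ...
    'ColorPreprocessing', 'gray2rgb');

opts = trainingOptions('sgdm', 'InitialLearnRate', 1e-4, 'MaxEpochs', 5);
trainedNet = trainNetwork(auimds, layers, opts);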
1 Comment
Michael Benton on 2 Sep 2018
Edited: Michael Benton on 4 Sep 2018
According to https://www.mathworks.com/help/nnet/ref/augmentedimagedatastore.html, that can be done. I have updated to R2018a, and this works. Thanks.
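
For completeness, augmentedImageDatastore also accepts in-memory arrays directly, which matches the original goal of skipping imageDatastore altogether; X and Y below are placeholder names for a 50x50x1xN image array and an Nx1 categorical label vector already in RAM:

% X: 50x50x1xN array of small grayscale images held in RAM
% Y: Nx1 categorical vector of labels
% Resizing and gray-to-RGB conversion happen per mini-batch, so only the small
% originals need to fit in memory.
auimds = augmentedImageDatastore([227 227], X, Y, 'ColorPreprocessing', 'gray2rgb');
trainedNet = trainNetwork(auimds, layers, opts);   % layers/opts as in the sketch above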

