Access network during training with trainNetwork

3 views (last 30 days)
Bradley Treeby on 2 Jul 2020
Commented: Bradley Treeby on 1 Sep 2020
Is there any way to access the current network during training with trainNetwork? I am doing image-to-image training with a CNN and would like to use the current network with predict to make a more useful display while the training is running (e.g., by plotting the best and worst examples from the validation set).
I tried saving the current network state using the 'CheckpointPath' option of trainingOptions and then re-loading the latest checkpoint file after every epoch using a custom function set via 'OutputFcn'. This all works fine; however, it seems the checkpoint file can't be used with predict because of the batch normalisation layers (see here - I encounter the same error in R2020a).
As suggested in the link above, I could re-run trainNetwork on the checkpoint with a tiny training set (one image?) and a very small learning rate to finalise it. But logically, the current network must be available somewhere, since MATLAB uses it to compute the current validation loss, presumably via some variant of predict.
Note, I don't actually care about using the checkpoint files, just accessing the current state of the network somehow.
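For reference, the checkpoint-based workaround described above looks roughly like the following sketch. The paths, the epoch test, and the data variables (trainImages, targetImages, validationImages) are illustrative assumptions, not code from the question:

```matlab
% Sketch of the checkpoint workaround (illustrative only).
options = trainingOptions('adam', ...
    'CheckpointPath', 'checkpoints', ...  % save net_checkpoint_*.mat each epoch
    'OutputFcn', @showExamples);          % custom function called during training

net = trainNetwork(trainImages, targetImages, layers, options);

function stop = showExamples(info)
    stop = false;                         % returning true would halt training
    if info.State == "iteration"
        % Load the most recent checkpoint file.
        d = dir(fullfile('checkpoints', 'net_checkpoint_*.mat'));
        if isempty(d), return; end
        [~, idx] = max([d.datenum]);
        ckpt = load(fullfile(d(idx).folder, d(idx).name));
        % In R2020a this predict call fails, because the checkpoint's
        % batch normalisation statistics have not yet been finalised.
        pred = predict(ckpt.net, validationImages);
    end
end
```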

Accepted Answer

Srivardhan Gadila on 19 Aug 2020
Based on the above information, I would suggest defining/converting your network into a dlnetwork and using a custom training loop to train it. dlnetwork has forward and predict object functions.
You can refer to documentation of dlnetwork & the example Train Network Using Custom Training Loop for more information.
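A minimal sketch of that approach is below. The loss, optimizer, hyperparameter names, and the nextBatch helper are assumptions for illustration; the key point is that net is an ordinary variable inside the loop, so predict can be called on it at any time (e.g. once per epoch to plot the best and worst validation examples):

```matlab
% Minimal custom-training-loop sketch with dlnetwork (illustrative only).
net = dlnetwork(layerGraph(layers));   % convert the layer array/graph
avgGrad = []; avgSqGrad = [];          % Adam optimizer state

for epoch = 1:numEpochs
    for i = 1:numIterationsPerEpoch
        [X, T] = nextBatch(i);                   % hypothetical mini-batch helper
        dlX = dlarray(single(X), 'SSCB');        % spatial, spatial, channel, batch
        dlT = dlarray(single(T), 'SSCB');
        [loss, gradients] = dlfeval(@modelGradients, net, dlX, dlT);
        iteration = (epoch - 1)*numIterationsPerEpoch + i;
        [net, avgGrad, avgSqGrad] = adamupdate(net, gradients, ...
            avgGrad, avgSqGrad, iteration);
    end
    % The current network is directly available here:
    YVal = predict(net, dlXValidation);          % inference-mode forward pass
    % ... plot best/worst validation examples from YVal ...
end

function [loss, gradients] = modelGradients(net, dlX, dlT)
    dlY = forward(net, dlX);                     % training-mode forward pass
    loss = mse(dlY, dlT);                        % image-to-image regression loss
    gradients = dlgradient(loss, net.Learnables);
end
```

Note the distinction: forward is used inside the loss function during training (it updates batch normalisation statistics), while predict is used for inference, which is exactly what the display code needs.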
1 Comment

Bradley Treeby on 1 Sep 2020
Great, thanks. This is just what I was looking for. The approach looks very powerful.

