CNN Deep Learning: Data Size vs. Iterations per Epoch

9 views (last 30 days)
Gobert on 17 Sep 2020
Commented: Gobert on 21 Sep 2020
I need your help to understand why the data size affects the number of iterations per epoch. See figures A and B below.
(Figures A and B: training-progress plots showing the iterations per epoch for each data size.)
In figure A, the training data size is 3,700 images; in figure B, it is 57,000 images. I did not change any settings in my CNN network, and in both cases the input images had the same size. Can you please explain why increasing the data size increased the number of iterations per epoch? In other words, what is the relationship between data size and the number of iterations per epoch?
2 Comments
Ritu Panda on 21 Sep 2020
Iterations per epoch depends on the number of training samples the model is trained on in each epoch.
For each epoch, your training data is divided into batches (specified by the MiniBatchSize parameter in the training options). The model trains on every batch and updates the weight parameters.
Hence, iterations per epoch = ceil(number of training samples ÷ MiniBatchSize),
i.e., the number of forward and backward passes performed in each epoch while training the network. Increasing the data size while keeping MiniBatchSize fixed therefore increases the number of iterations per epoch.
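The arithmetic above can be checked directly with the two data sizes from the question. A minimal sketch (written in Python, since the batch arithmetic is language-agnostic): the batch size of 128 is an assumption here, chosen because it is MATLAB's default MiniBatchSize; the thread does not state the actual value used.

```python
import math

def iterations_per_epoch(num_samples, mini_batch_size):
    # Each epoch visits every training sample once, in batches of
    # mini_batch_size; a final partial batch still counts as one iteration,
    # hence the ceiling.
    return math.ceil(num_samples / mini_batch_size)

# Assuming a MiniBatchSize of 128 (MATLAB's default, not stated in the thread):
print(iterations_per_epoch(3700, 128))   # figure A's data size → 29
print(iterations_per_epoch(57000, 128))  # figure B's data size → 446
```

So with the same batch size, roughly 15× more training images means roughly 15× more iterations per epoch, which matches the behavior described in the question.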
Gobert on 21 Sep 2020
I agree with you, @Ritu Panda

Answers (0)
