Gradient clipping with custom feed-forward net

Christoph Aistleitner on 28 Jul 2021
Answered: Artem Lensky on 4 Dec 2022
Every time I train my custom feed-forward net with 2 inputs and one output (time series) using the train(net, ...) function, after ~10 training epochs the value of the gradient reaches the preset value and the training stops.
Changing the network's architecture is not an option in my case.
Is there a way to implement "gradient clipping" with a feed-forward net?
Or is there any other workaround for the "exploding gradient" problem?
1 comment
Christoph Aistleitner on 28 Jul 2021
*gradient reaches the preset maximum


Accepted Answer

Vineet Joshi on 1 Sep 2021
Hi,
The following documentation link provides details on dealing with exploding gradients in MATLAB.
Hope this helps.
Thanks
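For a network trained with trainNetwork, gradient clipping is available directly through trainingOptions, so no custom code is needed. A minimal sketch (the solver choice and parameter values here are illustrative, not from the original answer):

    % Sketch: enable built-in gradient clipping via trainingOptions.
    options = trainingOptions("adam", ...
        MaxEpochs=30, ...
        GradientThreshold=1, ...            % clip gradients whose norm exceeds 1
        GradientThresholdMethod="l2norm");  % clip by per-parameter L2 norm

By default GradientThreshold is Inf (no clipping); setting it to a finite value activates clipping with the chosen method ("l2norm", "global-l2norm", or "absolute-value").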
1 comment
Artem Lensky on 4 Dec 2022
The answer you provided does not cover a custom training loop. See this example: https://au.mathworks.com/help/deeplearning/ug/train-network-using-custom-training-loop.html, which contains the following line:
[loss,gradients,state] = dlfeval(@modelLoss,net,X,T);
The question is how to apply clipping to these gradients. Is there a standard MATLAB function that can do this for me, or should I implement it myself?


More Answers (1)

Artem Lensky on 4 Dec 2022
Please check this link, which illustrates several examples of how to implement, for custom training loops, the training options you would usually define via trainingOptions() and use with trainNetwork(). Here is the L2-norm clipping example given in the link above:
function gradients = thresholdL2Norm(gradients,gradientThreshold)
    % Rescale the gradient so that its L2 norm does not exceed the threshold.
    gradientNorm = sqrt(sum(gradients(:).^2));
    if gradientNorm > gradientThreshold
        gradients = gradients * (gradientThreshold / gradientNorm);
    end
end
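In a custom training loop, a clipping function like the one above can be applied to every learnable parameter's gradient with dlupdate before the update step. A minimal sketch (the threshold value, the modelLoss function, and the Adam state variables are illustrative assumptions, following the linked custom-training-loop example):

    % Sketch: clip gradients per parameter inside a custom training loop.
    gradientThreshold = 2;  % illustrative value

    % Compute loss and gradients as in the linked example.
    [loss,gradients,state] = dlfeval(@modelLoss,net,X,T);

    % Apply the L2-norm clipping function to each parameter's gradient.
    gradients = dlupdate(@(g) thresholdL2Norm(g,gradientThreshold),gradients);

    % Update the network with the clipped gradients (Adam shown as an example).
    [net,averageGrad,averageSqGrad] = adamupdate(net,gradients, ...
        averageGrad,averageSqGrad,iteration);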
You might also find this link useful: https://au.mathworks.com/help/deeplearning/ug/detect-vanishing-gradients-in-deep-neural-networks.html, which discusses detecting vanishing gradients in deep neural networks.

Release: R2021a
