Optimizing the GRU training process using Bayesian optimization shows errors

9 views (last 30 days)
Yuanru Zou on 16 November 2023
Commented: Yuanru Zou on 22 November 2023
Hi all, I'm having a problem optimizing GRU parameters with Bayesian optimization. The code itself doesn't report an error, but some iterations of the Bayesian optimization process show ERROR. What should I do about it? I would greatly appreciate any help.

Accepted Answer

Alan Weiss on 16 November 2023
The error is coming from your code. Apparently, some points visited (for example, NumOfUnits = 30, InitialLearnRate = 0.8 or 0.2, L2Regularization = 0.0048 or 7.5e-6) cause your objective function or nonlinear constraint functions to return NaN.
You can test this outside of bayesopt to see where your code returns NaN.
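A minimal sketch of that check, calling the objective directly at one of the failing points (the function name myObjective and the exact variable bounds are assumptions based on the question's setup, not code from the original post):

```matlab
% Hypothetical probe: evaluate the bayesopt objective at a point that
% showed ERROR in the optimization trace. bayesopt passes points as a
% one-row table, so we build one with the same variable names.
pt = table(30, 0.8, 0.0048, ...
    'VariableNames', {'NumOfUnits','InitialLearnRate','L2Regularization'});
try
    val = myObjective(pt);            % same call bayesopt would make
    fprintf('Objective = %g (NaN? %d)\n', val, isnan(val));
catch err
    % If the objective throws here, bayesopt marks that iteration as ERROR
    fprintf('Objective threw an error: %s\n', err.message);
end
```

Looping this over each parameter combination listed in the ERROR iterations shows whether the failures come from NaN objective values or from a thrown error inside network training.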
If your code is running as expected, then there is nothing wrong with ignoring the iterations that lead to errors.
Alan Weiss
MATLAB mathematical toolbox documentation
  5 comments
Alan Weiss on 21 November 2023
I'm sorry, but I don't know much about deep learning, so I don't think I can help you with your code. It looks like you are training a neural network and optimizing it to minimize the mean squared error. I don't see anything obviously wrong, but then again I don't know what would cause the network training process, or something else, to throw an error. Usually in these systems there is so much randomness involved (from the stochastic gradient descent to the data collection process) that things can get noisy or fail for a variety of reasons. In your case, I really don't know.
Sorry.
Alan Weiss
MATLAB mathematical toolbox documentation
Yuanru Zou on 22 November 2023
Okay, thanks for your patience and help, I hope you're doing well at work and in good health!


More Answers (0)


