Gradient descent for a custom function
I have four equations:
1) e = m - y
2) y = W_3 * h
3) h = z + W_2 * z + f
4) f = W_1 * x
I want to update W_1, W_2, and W_3 to minimize the cost function J = e^T e using gradient descent.
x is an input, y is the output, and m is the desired value for each sample in the dataset.
I would like to do
W_1 = W_1 - eta * grad(J)_W_1
W_2 = W_2 - eta * grad(J)_W_2
W_3 = W_3 - eta * grad(J)_W_3
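For reference, this is the plain loop I have in mind (a minimal sketch: the sizes, sample data, iteration count, and learning rate below are made-up example values, and the gradients are hand-derived by the chain rule on J = e^T e):

% assumed example sizes and one sample of data; replace with your own
nx = 4; nz = 3; ny = 2;
x = randn(nx,1); z = randn(nz,1); m = randn(ny,1);
W1 = 0.1*randn(nz,nx); W2 = 0.1*randn(nz,nz); W3 = 0.1*randn(ny,nz);
eta = 1e-2;                    % learning rate

for iter = 1:1000
    % forward pass through equations 4), 3), 2), 1)
    f = W1*x;
    h = z + W2*z + f;
    y = W3*h;
    e = m - y;                 % J = e'*e

    % chain rule: dJ/dy = -2*e, and everything reaches y through W3*h
    gW3 = -2*e*h';             % grad(J)_W_3
    gh  = -2*W3'*e;            % dJ/dh
    gW2 = gh*z';               % grad(J)_W_2  (h depends on W_2 via W_2*z)
    gW1 = gh*x';               % grad(J)_W_1  (h depends on W_1 via f = W_1*x)

    % gradient-descent updates
    W3 = W3 - eta*gW3;
    W2 = W2 - eta*gW2;
    W1 = W1 - eta*gW1;
end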
Going through the documentation, I found that you can train standard neural networks. But since I have some custom functions, I guess I would need a built-in optimization function instead.
Any ideas?
2 Comments
Matt J
24 Apr 2024
x is an input, y is the output, and m is the desired value for each sample in the dataset.
It looks like z is also an input. It is not given by any other equations.
Answers (2)
Matt J
24 Apr 2024
Edited: Matt J on 24 Apr 2024
I guess I would need a built-in optimization function instead.
No, not necessarily. Your equations can be implemented with fullyConnectedLayers and additionLayers.
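For example, something along these lines; a minimal sketch, assuming x, z, and y are feature (column) vectors, with made-up sizes and layer names, and with each fullyConnectedLayer's bias pinned at zero so it reduces to a pure W*input product:

nx = 4; nz = 3; ny = 2;   % assumed sizes: x in R^nx, z and h in R^nz, y in R^ny

% bias-free fully connected layer, i.e. just a learnable matrix W
fc = @(n,name) fullyConnectedLayer(n, 'Name', name, ...
         'BiasInitializer', 'zeros', 'BiasLearnRateFactor', 0);

lg = layerGraph();
lg = addLayers(lg, [featureInputLayer(nx,'Name','x'); fc(nz,'W1')]);  % f = W_1*x
lg = addLayers(lg, featureInputLayer(nz,'Name','z'));
lg = addLayers(lg, fc(nz,'W2'));                                      % W_2*z
lg = addLayers(lg, additionLayer(3,'Name','add'));                    % h = z + W_2*z + f
lg = addLayers(lg, fc(ny,'W3'));                                      % y = W_3*h

lg = connectLayers(lg, 'z',  'W2');
lg = connectLayers(lg, 'z',  'add/in1');
lg = connectLayers(lg, 'W2', 'add/in2');
lg = connectLayers(lg, 'W1', 'add/in3');
lg = connectLayers(lg, 'add', 'W3');

net = dlnetwork(lg);

You can then train net with a custom dlfeval/dlgradient loop, or on recent releases with trainnet using "mse" loss, which matches e'*e up to a constant factor.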