clamp cross-entropy loss

Matt Fetterman on 3 Sep 2020
Commented: Matt Fetterman on 6 Sep 2020
The MATLAB cross-entropy loss has this form:
loss = -sum(W*(T.*log(Y)))/N;
I would like to "clamp" it so that the output of the log function is bounded; for example, so that it cannot be less than 100.
Can we do it?

Accepted Answer

David Goodmanson on 3 Sep 2020
Edited: David Goodmanson on 6 Sep 2020
Hi Matt,
z = log(Y);
z(z < 100) = 100;            % clamp the log output from below
loss = -sum(W*(T.*z))/N;
In the link you provided, they talk about a limit of -100 rather than +100; the former appears to make more sense. There are lots of possibilities for a smooth, differentiable cutoff. Here is one, assuming Y >= 0:
Ylimit = -100;
loss = -sum(W*(T.*log(Y+exp(Ylimit))))/N;
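A minimal plain-Python sketch of the two approaches, for readers without MATLAB at hand. It assumes W is a vector of per-sample weights combined element-wise (the original uses MATLAB's matrix product W*(T.*z), which reduces to the same sum for a row-vector W); both the hard clamp and the smooth log(Y + exp(Ylimit)) cutoff bound the loss even when Y contains an exact zero:

```python
import math

def hard_clamp_loss(W, T, Y, N, floor=-100.0):
    """Cross-entropy with log(Y) hard-clamped from below at `floor`."""
    total = 0.0
    for w, t, y in zip(W, T, Y):
        z = math.log(y) if y > 0 else float("-inf")  # log(0) -> -inf
        z = max(z, floor)                            # clamp from below
        total += w * t * z
    return -total / N

def smooth_cutoff_loss(W, T, Y, N, floor=-100.0):
    """Smooth variant: log(Y + exp(floor)) >= floor for Y >= 0,
    and is indistinguishable from log(Y) once Y >> exp(floor)."""
    total = 0.0
    for w, t, y in zip(W, T, Y):
        total += w * t * math.log(y + math.exp(floor))
    return -total / N

W, T, Y, N = [1.0, 1.0], [1.0, 1.0], [0.0, 0.5], 2
print(hard_clamp_loss(W, T, Y, N))     # finite despite Y[0] == 0
print(smooth_cutoff_loss(W, T, Y, N))  # agrees closely with the hard clamp
```

With exp(-100) on the order of 1e-44, the smooth cutoff perturbs log(Y) negligibly for any Y of practical size, which is why the two variants agree to many decimal places here while the smooth one remains differentiable everywhere.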
  3 Comments
David Goodmanson on 6 Sep 2020
Hi Matt,
see amended answer.
Matt Fetterman on 6 Sep 2020
probably a smart approach.


More Answers (0)


