How to design the SAGAN self-attention layer?

Michael Keeling on 9 Jul 2020
Commented: fneg on 3 Jul 2021
I'm curious whether the self-attention layer described in this paper could be implemented in MATLAB. Would it require a custom layer to construct? It does not seem feasible inside the Deep Network Designer. I'd appreciate a pointer in the right direction.
Thank you!
1 comment
fneg on 3 Jul 2021
Hi, have you been able to implement this attention block in MATLAB?


Answers (1)

Divya Gaddipati on 23 Jul 2020
You would have to create a custom layer. If you are using R2020a, you can load the custom layer into the Deep Network Designer from the workspace.
You can refer to the documentation on defining custom deep learning layers for information on how to create one.
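As a rough illustration, below is a minimal sketch of what such a custom layer might look like, following the SAGAN formulation: 1x1 convolutions for query, key, and value maps, a softmax attention over spatial positions, and a learnable scalar gamma on a residual connection. The class name selfAttentionLayer, the weight initialization, and the channel-reduction factor of 8 are illustrative choices, not part of the original answer; the batch loop avoids pagemtimes so it should work on older releases, but treat it as a starting point rather than a tested implementation.

classdef selfAttentionLayer < nnet.layer.Layer
    % Illustrative SAGAN-style self-attention layer (sketch, not production code).
    properties
        NumChannels   % number of input/output channels C
    end
    properties (Learnable)
        Wf     % 1x1 conv weights for the query map f(x)
        Wg     % 1x1 conv weights for the key map g(x)
        Wh     % 1x1 conv weights for the value map h(x)
        Wv     % 1x1 conv weights projecting back to C channels
        Gamma  % learnable scalar, initialized to 0 as in the paper
    end
    methods
        function layer = selfAttentionLayer(numChannels, name)
            layer.Name = name;
            layer.Description = "SAGAN-style self-attention (" + numChannels + " channels)";
            layer.NumChannels = numChannels;
            cBar = max(1, floor(numChannels/8));   % reduced channel count (C/8, as in the paper)
            layer.Wf = 0.02*randn([1 1 numChannels cBar], 'single');
            layer.Wg = 0.02*randn([1 1 numChannels cBar], 'single');
            layer.Wh = 0.02*randn([1 1 numChannels cBar], 'single');
            layer.Wv = 0.02*randn([1 1 cBar numChannels], 'single');
            layer.Gamma = single(0);
        end

        function Z = predict(layer, X)
            % X is H-by-W-by-C-by-N (spatial, spatial, channel, batch)
            [H, W, ~, N] = size(X);
            cBar = size(layer.Wf, 4);

            % 1x1 convolutions producing query, key, and value feature maps
            f = dlconv(X, layer.Wf, 0, 'DataFormat', 'SSCB');
            g = dlconv(X, layer.Wg, 0, 'DataFormat', 'SSCB');
            h = dlconv(X, layer.Wh, 0, 'DataFormat', 'SSCB');

            HW = H*W;
            attended = cell(1, N);
            for n = 1:N   % loop over the batch to stay compatible with older releases
                fn = reshape(f(:,:,:,n), HW, cBar);
                gn = reshape(g(:,:,:,n), HW, cBar);
                hn = reshape(h(:,:,:,n), HW, cBar);

                % Attention over spatial positions: row-wise softmax of f*g'
                s    = fn*gn';
                s    = s - max(s, [], 2);          % subtract row max for numerical stability
                beta = exp(s) ./ sum(exp(s), 2);

                attended{n} = reshape(beta*hn, H, W, cBar);
            end
            O = cat(4, attended{:});                           % H x W x cBar x N
            O = dlconv(O, layer.Wv, 0, 'DataFormat', 'SSCB');  % project back to C channels
            Z = X + layer.Gamma .* O;                          % residual blend: y = gamma*o + x
        end
    end
end

Once saved on the path, an instance created in the workspace, e.g. attn = selfAttentionLayer(64, "self_attention") for assumed 64-channel feature maps, can be placed in a layer array or loaded into the Deep Network Designer from the workspace in R2020a and later.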
