
How can I quantize the activations of each layer in an lgraph or dlnetwork?

Runcong Kuang on 8 May 2023
Commented: Runcong Kuang on 8 May 2023
How can I quantize the activations of each layer in a dlnetwork or lgraph?
My dlnetwork is only a fragment of a network, not a complete network that runs to the end.
It contains three conv2d layers and no classification layer.
I only use it for inference, so quantization-aware training is not needed.
Only post-training quantization is required.
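
For post-training quantization in MATLAB, the usual starting point is the dlquantizer workflow from the Deep Learning Toolbox Model Quantization Library: a calibration pass over representative inputs collects the dynamic ranges of the weights, biases, and layer activations, which is exactly the per-layer activation information asked about here. A minimal sketch, assuming that support package is installed, that your release accepts a dlnetwork as input to dlquantizer, and that net is the three-conv-layer dlnetwork while calImages is a representative H-by-W-by-C-by-N calibration array (both names are placeholders):

% Calibration data: any datastore whose output matches the network input works;
% an arrayDatastore over a 4-D image array is used here only as a placeholder.
calDS = arrayDatastore(calImages, 'IterationDimension', 4);

% Post-training quantization: no training is involved, only a calibration pass.
quantObj   = dlquantizer(net, 'ExecutionEnvironment', 'MATLAB');
calResults = calibrate(quantObj, calDS);   % table of min/max ranges per layer,
disp(calResults)                           % including each layer's activations

qNet     = quantize(quantObj);             % network that simulates int8 inference
qDetails = quantizationDetails(qNet);      % lists which learnables/activations were quantized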

Answers (1)

Matt J on 8 May 2023
  1 Comment
Runcong Kuang on 8 May 2023
Thanks. I thought this was only for the weights and biases, but I just saw that it also applies to the activations. Good. However, I could not use this function because of a compiler problem, even with 'matlab' as the execution environment. So I have to quantize the weights and biases manually and insert them into each layer, but I can't do the same for the activations.
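
If the dlquantizer route is blocked by the compiler issue, the activations can still be handled manually in the same spirit as the weights and biases: run a calibration pass, record each layer's activation range, and turn that range into a fixed scale applied at inference time. A rough sketch under those assumptions; the layer names 'conv_1'/'conv_2'/'conv_3', the calibration array calImages, and the symmetric power-of-two int8 scheme are illustrative choices, not anything prescribed by MATLAB:

% Estimate per-layer activation ranges from calibration data.
layerNames = {'conv_1', 'conv_2', 'conv_3'};
actMax = zeros(1, numel(layerNames));

for i = 1:size(calImages, 4)                                    % loop over calibration images
    dlX = dlarray(single(calImages(:, :, :, i)), 'SSCB');
    [a1, a2, a3] = predict(net, dlX, 'Outputs', layerNames);    % intermediate activations
    acts = {a1, a2, a3};
    for k = 1:numel(acts)
        actMax(k) = max(actMax(k), max(abs(extractdata(acts{k})), [], 'all'));
    end
end

% Symmetric int8 scaling: pick a power-of-two exponent so actMax maps inside [-128, 127].
actExp   = ceil(log2(max(actMax, eps) / 127));
actScale = 2 .^ actExp;

% At inference, fake-quantize each layer's output with its scale.
quantizeAct   = @(x, s) max(-128, min(127, round(x ./ s)));     % float -> int8 codes
dequantizeAct = @(q, s) single(q) .* s;                         % int8 codes -> float

The resulting per-layer scales play the same role as the activation ranges that calibrate would have reported; how you apply them (true int8 arithmetic versus simulated quantize/dequantize around each conv) depends on your deployment target.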


Release: R2022b
