coding structure of gaussian noise layer

2 views (last 30 days)
jianY xu on 18 September 2018
Answered: Jack Xiao on 22 February 2021
I want to create a custom layer that adds noise to the data.
My MATLAB version is R2017b, so I don't have the example file "gaussianNoiseLayer.m".
In R2018a or R2018b, that file is located at fullfile(matlabroot, 'examples', 'nnet', 'main', 'gaussianNoiseLayer.m').
I really want to know the coding structure of a noise-adding layer.
If anyone has the latest version of MATLAB installed, could you send me a copy of this file?
email: xjy1236@sina.com Thank you very much!
1 Comment
MAHSA YOUSEFI on 4 January 2021
Hi Jian.
Did you solve your problem with adding noise?
I want to add Gaussian noise to the input and to each hidden layer in my customized training loop.


Answers (1)

Jack Xiao on 22 February 2021
Here is the code:
classdef gaussianNoiseLayer < nnet.layer.Layer
    % gaussianNoiseLayer   Gaussian noise layer
    %   A Gaussian noise layer adds random Gaussian noise to the input.
    %
    %   To create a Gaussian noise layer, use
    %       layer = gaussianNoiseLayer(sigma, name)

    properties
        % Standard deviation of the Gaussian noise.
        Sigma
    end

    methods
        function layer = gaussianNoiseLayer(sigma, name)
            % layer = gaussianNoiseLayer(sigma, name) creates a Gaussian
            % noise layer and specifies the standard deviation and layer
            % name.
            layer.Name = name;
            layer.Description = ...
                "Gaussian noise with standard deviation " + sigma;
            layer.Type = "Gaussian Noise";
            layer.Sigma = sigma;
        end

        function Z = predict(layer, X)
            % Z = predict(layer, X) forwards the input data X through the
            % layer for prediction and outputs the result Z.
            % At prediction time, the output is equal to the input.
            Z = X;
        end

        function [Z, memory] = forward(layer, X)
            % Z = forward(layer, X) forwards the input data X through the
            % layer and outputs the result Z.
            % At training time, the layer adds Gaussian noise to the input.
            sigma = layer.Sigma;
            noise = randn(size(X)) * sigma;
            Z = X + noise;
            memory = [];
        end

        function dLdX = backward(layer, X, Z, dLdZ, memory)
            % dLdX = backward(layer, X, Z, dLdZ, memory) propagates the
            % derivative of the loss function through the layer.
            % Because the added noise does not depend on X, the derivative
            % dLdX is equal to dLdZ.
            dLdX = dLdZ;
        end
    end
end
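
Once the class is saved on the MATLAB path as gaussianNoiseLayer.m, it can be used like any built-in layer in a layer array. A minimal sketch (the network architecture, layer sizes, and the sigma value 0.1 here are illustrative choices, not from the original example):

```matlab
% Assumes gaussianNoiseLayer.m (the class above) is on the MATLAB path.
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 16, 'Padding', 'same')
    gaussianNoiseLayer(0.1, 'noise')   % inject noise with sigma = 0.1 during training
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];
```

Because noise is only injected in forward (called during training) while predict returns its input unchanged, the layer acts as an identity at inference time, so no special handling is needed when the trained network is deployed.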
