How to use deep learning for interpolation properly

Alexey Kozhakin — 29 November 2021
Hi!
What am I doing wrong? I am trying to use deep learning to interpolate the function sin(x), but it does not learn well: instead of a sine curve the network just outputs a straight line. This is my code:
clear all;
clc;
close all;
N=100;
T=2*pi;
x=0:T/(N-1):T;
y=sin(x);
%----- normalization ---------------
y=y+abs(min(y));   % shift so the minimum is 0
y=y./max(y);       % scale y to [0, 1]
x=x./max(x);       % scale x to [0, 1]
%---------- end normalization -----
plot(x,y,'o')
%------------------------
layers = [
featureInputLayer(1)
fullyConnectedLayer(10)
fullyConnectedLayer(1)
regressionLayer
];
options = trainingOptions('adam', ...
'MaxEpochs',20, ...
'MiniBatchSize',10, ...
'Shuffle','every-epoch', ...
'Plots','training-progress', ...
'Verbose',false);
[net,info] = trainNetwork(x',y',layers,options);
y1 = predict(net,x');
figure(2); hold on;
plot(x,y,'ok','MarkerSize',20)
plot(x,y1,'x','MarkerSize',20)

Accepted Answer

Abolfazl Chaman Motlagh — 30 November 2021
Hi,
sin(x) is a thoroughly nonlinear function, while your network is too simple to capture that nonlinearity: with no activation layers between the fully connected layers, the whole network can only represent a linear (straight-line) function. To overcome this you can:
  • increase the number of layers
  • increase the number of parameters (neurons per layer)
  • add activation layers to introduce nonlinearity
  • give the network more iterations to learn, e.g. MaxEpochs = 2000, since your data and network are small
Here is a simple fix for your model:
layers = [
featureInputLayer(1)
fullyConnectedLayer(20)
reluLayer
fullyConnectedLayer(20)
reluLayer
fullyConnectedLayer(20)
fullyConnectedLayer(1)
regressionLayer
];
The result (5 fully connected layers with 20 neurons each) fits the sine well. [Result plot omitted.]
Here are some other examples:
  • 2 fully connected layers, each with 200 neurons (you can see it still does not converge correctly)
  • 3 fully connected layers, each with 5 neurons
So you can change these parameters (even tune each layer's size individually) to see what the best option is for your application.
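The bullet about more iterations can be sketched as a small change to the trainingOptions from the question; only MaxEpochs differs (2000 is an illustrative value, not a tuned one):

```matlab
options = trainingOptions('adam', ...
    'MaxEpochs',2000, ...        % was 20; a small dataset needs many passes
    'MiniBatchSize',10, ...
    'Shuffle','every-epoch', ...
    'Plots','training-progress', ...
    'Verbose',false);
```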
6 comments
Abolfazl Chaman Motlagh — 5 December 2021
For the first question, here is an example:
N=100;
T=2*pi;
x=0:T/(N-1):T;
y=sin(x);
y=y+abs(min(y));
y=y./max(y);
x=x./max(x);
layers = [
featureInputLayer(1)
fullyConnectedLayer(20)
reluLayer
fullyConnectedLayer(20)
reluLayer
fullyConnectedLayer(20)
reluLayer
fullyConnectedLayer(20)
reluLayer
fullyConnectedLayer(20)
fullyConnectedLayer(1)
regressionLayer
];
options = trainingOptions('sgdm', ...
'MaxEpochs',600, ...
'MiniBatchSize',10, ...
'Shuffle','every-epoch', ...
'Plots','training-progress', ...
'InitialLearnRate',0.01, ...
'LearnRateSchedule','piecewise', ...
'LearnRateDropFactor',0.1, ...
'LearnRateDropPeriod',200, ...
'Verbose',false);
[net,info] = trainNetwork(x',y',layers,options);
y1 = predict(net,x');
plot(x,y,'o','LineWidth',2)
hold on
plot(x,y1,'-','LineWidth',2)
legend('Sin(x)','Predicted')
Abolfazl Chaman Motlagh — 5 December 2021
For the second question:
A reluLayer applies the ReLU function elementwise to every output of the preceding layer. So just by adding a reluLayer to the layer list, you apply the activation to every neuron of that layer.
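To make the elementwise behavior concrete, here is a minimal sketch (the vector z is made up for illustration):

```matlab
% ReLU acts elementwise: relu(z) = max(z, 0).
% This is what a reluLayer computes on the previous layer's outputs.
z = [-2 -0.5 0 1.5 3];   % example outputs of a fully connected layer
a = max(z, 0);           % every negative entry is clipped to zero
disp(a)                  % -> [0 0 0 1.5 3]
```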
MATLAB has standard implementations of other common activation functions, such as:
  • reluLayer
  • leakyReluLayer
  • clippedReluLayer
  • eluLayer
  • tanhLayer
  • swishLayer
But if you want, you can define your own activation function in MATLAB with functionLayer, or even implement a complete custom deep learning layer (a class inheriting from nnet.layer.Layer) with its own forward operation.
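As a minimal sketch of the functionLayer route (the slope 0.1 and the layer name 'leaky01' are arbitrary choices for illustration; functionLayer requires R2021b or later):

```matlab
% A custom "leaky" activation defined inline with functionLayer.
leaky = functionLayer(@(X) max(X, 0.1*X), 'Name', 'leaky01');

layers = [
    featureInputLayer(1)
    fullyConnectedLayer(20)
    leaky                    % used exactly like reluLayer in the list
    fullyConnectedLayer(1)
    regressionLayer
    ];
```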

Release: R2021b
