Using tapped delays when training an artificial neural network for dynamic system modeling
Hello, I'm trying to use an artificial neural network to build a model of the following system: ![](https://www.mathworks.com/matlabcentral/answers/uploaded_files/1390249/image.png)
From what I understand so far, I should first record the response of the system to an arbitrary input and use that data to train my network. After discretisation, the following second-order difference equation describes the dynamic behaviour of the system:
![](https://www.mathworks.com/matlabcentral/answers/uploaded_files/1390254/image.png)
Now, how do I implement these delays in my neural network? Should I simply feed the delayed inputs and outputs of the plant as inputs to the network, or is there an easier way to achieve this?
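One common approach in MATLAB is to let a NARX network handle the tapped delays for you rather than building the delayed regressors by hand. The sketch below is a minimal, hedged example using `narxnet` and `preparets` from the Deep Learning Toolbox; the signal names `u` and `y`, the delay orders `1:2` (matching a second-order difference equation), and the hidden-layer size are assumptions you would replace with your own recorded plant data and model structure.

```matlab
% Sketch only: u and y stand in for your recorded plant input/output.
u = num2cell(randn(1, 500));   % plant input sequence (cell array of scalars)
y = num2cell(randn(1, 500));   % plant output sequence

% NARX network: input delays 1:2 and feedback (output) delays 1:2
% correspond to the u(k-1), u(k-2), y(k-1), y(k-2) terms of a
% second-order difference equation; 10 hidden neurons is an assumption.
net = narxnet(1:2, 1:2, 10);

% preparets shifts the data so the tapped delay lines are filled
% correctly, returning shifted inputs Xs, initial delay states Xi/Ai,
% and shifted targets Ts.
[Xs, Xi, Ai, Ts] = preparets(net, u, {}, y);

% Train in open loop (series-parallel form), using measured y as feedback.
net = train(net, Xs, Ts, Xi, Ai);

% For multi-step simulation of the trained model, close the loop so the
% network feeds back its own predictions instead of measured outputs.
netc = closeloop(net);
```

Manually concatenating the delayed samples into an input matrix and feeding it to a static `feedforwardnet` would also work; `narxnet` is essentially a convenience wrapper around that same idea, with `preparets` managing the bookkeeping of the delay lines.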
Answers (0)