A Suitable Machine Learning Technique to Learn Y=f(X,t)
I have a black-box model that accepts an input vector X (variables) and produces three outputs as functions of time: Y1(t), Y2(t) and Y3(t). Here "t" is discrete time with a known number of time steps. The model is a simulator that predicts the output quantities over time, so Y1(t1), Y1(t2), ... are not independent. Y1(t), Y2(t) and Y3(t) may also be related to each other, but for now we can ignore that.
I have several instances (samples) of X with their corresponding outputs. Which machine learning technique can handle this learning problem, i.e. relate X with Y(t)?
I am a bit confused because I have only ever seen machine learning algorithms that relate X to a single output Y that is not time dependent. On the other hand, time-series prediction methods only look at Y = f(t) and ignore X (i.e. the input is t and the output is Y).
Any suggestion of a specific method is highly appreciated.
Accepted Answer
Ameer Hamza
6 May 2020
Yes, common feedforward neural networks are not well suited to time-series data. You can use them by defining multiple inputs (say n), each corresponding to a time-step value t(n), t(n-1), t(n-2), ..., t(1), but that is not a commonly used approach (a sketch of this kind of non-recurrent framing is given below).
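In your setup the time dependence sits on the output side rather than the input side, so one way to read this non-recurrent framing is to treat the whole time history as one long regression target for a plain feedforward network. A minimal sketch, assuming placeholder sizes and random data in place of your simulator runs:
numVars      = 5;     % input variables in X (placeholder)
numTimeSteps = 50;    % known number of time steps (placeholder)
numOutputs   = 3;     % Y1, Y2, Y3
numSamples   = 200;   % simulator runs available for training (placeholder)

X = rand(numVars, numSamples);                     % each column is one sample of X
Y = rand(numOutputs*numTimeSteps, numSamples);     % each column: Y1..Y3 at t1, then t2, ...

net = feedforwardnet([20 20]);                     % two hidden layers of 20 neurons
net = train(net, X, Y);                            % shallow-network regression

yNew = net(rand(numVars, 1));                      % predict one new input vector
yNew = reshape(yNew, numOutputs, numTimeSteps);    % recover the 3 x numTimeSteps histories
This ignores the ordering of the time steps entirely, which is exactly why it is not the preferred approach.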
For time-dependent data, the main tool is the LSTM network.
See MATLAB examples here:
You can also look at recurrent neural networks (RNNs), the general family of which LSTM is a special case. A sequence-to-sequence sketch for your setup is given below.
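As a concrete starting point, here is a minimal sequence-to-sequence LSTM regression sketch. Since your X is static and only the outputs vary with time, the idea is to repeat the input vector at every time step so the LSTM sees a sequence on both the input and output side. All sizes and the random data are placeholders for your simulator samples, not part of the original question.
numVars      = 5;      % input variables in X (placeholder)
numOutputs   = 3;      % Y1, Y2, Y3
numTimeSteps = 50;     % known number of time steps (placeholder)
numSamples   = 100;    % available simulator runs (placeholder)

XTrain = cell(numSamples, 1);
YTrain = cell(numSamples, 1);
for i = 1:numSamples
    x = rand(numVars, 1);                            % one static input vector
    XTrain{i} = repmat(x, 1, numTimeSteps);          % repeat X at every time step
    YTrain{i} = rand(numOutputs, numTimeSteps);      % the simulator's Y1..Y3 over time
end

layers = [
    sequenceInputLayer(numVars)
    lstmLayer(100, 'OutputMode', 'sequence')         % one output per time step
    fullyConnectedLayer(numOutputs)
    regressionLayer];

options = trainingOptions('adam', 'MaxEpochs', 60, 'Verbose', false);
net = trainNetwork(XTrain, YTrain, layers, options);

% Predict the full time histories for a new input vector
xNew  = rand(numVars, 1);
YPred = predict(net, repmat(xNew, 1, numTimeSteps)); % numOutputs x numTimeSteps
How well repeating X like this generalizes depends on your data; normalizing both X and the responses before training usually helps.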