Mixtures of Experts, Using Gaussian Mixture Models for the Gate

This code implements a mixture of experts using a Gaussian mixture model for the gate.
Downloads: 692
Updated 2014/11/11

View License

Because the gate is a Gaussian mixture model, it can be trained with the standard expectation-maximization (EM) algorithm, i.e., a single-loop EM. Other methods use the softmax function for the gate, which has no analytically closed-form solution for its update and therefore requires generalized expectation maximization (GEM), a double-loop EM. The problem with GEM is that it requires extra computation, and the step size must be chosen carefully to guarantee convergence of the inner loop. I used k-means clustering for initialization, but found only a small improvement from it. If you have any questions or recommendations, contact me.
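For illustration, the following is a minimal sketch (not the submitted code) of this idea: a mixture of linear experts whose gate is a Gaussian mixture over the inputs, trained with a single-loop EM and initialized with k-means. All function and variable names are made up for this example; it assumes the Statistics and Machine Learning Toolbox (kmeans, mvnpdf, normpdf) and a release with implicit expansion (R2016b or later).

% Sketch: mixture of linear experts with a Gaussian-mixture gate, single-loop EM
function model = moeGmmGateSketch(X, y, K, nIter)
% X: N-by-d inputs, y: N-by-1 targets, K: number of experts
[N, d] = size(X);
Xb = [X, ones(N, 1)];                       % inputs with a bias column

% k-means initialization, as described above
idx   = kmeans(X, K);
alpha = zeros(1, K);                        % gate mixing proportions
mu    = zeros(K, d);                        % gate component means
Sigma = zeros(d, d, K);                     % gate component covariances
W     = zeros(d + 1, K);                    % linear-expert weights (incl. bias)
s2    = ones(1, K);                         % expert noise variances
for j = 1:K
    in = (idx == j);
    alpha(j)     = mean(in);
    mu(j, :)     = mean(X(in, :), 1);
    Sigma(:,:,j) = cov(X(in, :)) + 1e-6 * eye(d);
    W(:, j)      = Xb(in, :) \ y(in);
    s2(j)        = var(y(in) - Xb(in, :) * W(:, j)) + 1e-6;
end

for it = 1:nIter
    % E-step: joint responsibility of gate component j and expert j
    R = zeros(N, K);
    for j = 1:K
        gate    = alpha(j) * mvnpdf(X, mu(j, :), Sigma(:,:,j));
        expert  = normpdf(y, Xb * W(:, j), sqrt(s2(j)));
        R(:, j) = gate .* expert;
    end
    R = R ./ sum(R, 2);

    % M-step: closed-form updates for gate and experts (no inner loop needed)
    for j = 1:K
        r  = R(:, j);
        sr = sum(r);
        alpha(j) = sr / N;
        mu(j, :) = (r' * X) / sr;
        Xc = X - mu(j, :);
        Sigma(:,:,j) = (Xc' * (Xc .* r)) / sr + 1e-6 * eye(d);
        W(:, j) = (Xb' * (Xb .* r)) \ (Xb' * (r .* y));   % weighted least squares
        s2(j)   = sum(r .* (y - Xb * W(:, j)).^2) / sr + 1e-10;
    end
end

model = struct('alpha', alpha, 'mu', mu, 'Sigma', Sigma, 'W', W, 's2', s2);
end

To predict at a new input x, weight each expert's output [x, 1] * W(:, j) by the gate posterior, which is proportional to alpha(j) * mvnpdf(x, mu(j, :), Sigma(:,:,j)); this is the same Gaussian-mixture gate used during training.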

Cite As

Joseph Santarcangelo (2024). Mixtures of Experts, Using Gaussian Mixture Models for the Gate (https://www.mathworks.com/matlabcentral/fileexchange/48367-mixtures-of-experts-using-gaussian-mixture-models-for-the-gate), MATLAB Central File Exchange. Retrieved.

MATLAB Release Compatibility
Created with R2008a
Compatible with any release
Platform Compatibility
Windows macOS Linux
Categories
Statistics and Machine Learning Toolbox

Version Published Release Notes
1.2.0.0

Didn't upload last time

1.1.0.0

There was an error in the first version; I also improved the documentation.

1.0.0.0