Grey Wolf Optimizer for Training Multi-Layer Perceptrons

The submission employs the recently proposed Grey Wolf Optimizer for training Multi-Layer Perceptrons.
Downloads: 2.6K
Updated: 22 May 2018

View License

The Grey Wolf Optimizer (GWO) is employed as a trainer for Multi-Layer Perceptrons (MLPs). This submission demonstrates the GWO trainer on the "Iris" classification problem.
The source code accompanies the paper:
S. Mirjalili, "How effective is the Grey Wolf optimizer in training multi-layer perceptrons", Applied Intelligence, in press, 2015, DOI: http://dx.doi.org/10.1007/s10489-014-0645-7

More information can be found on my personal web page: http://www.alimirjalili.com
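As a rough illustration of how the paper's approach works (not the submission's actual code), the following minimal MATLAB sketch evaluates one GWO search agent ("wolf") as a single-hidden-layer MLP: the agent's position vector x holds all connection weights and biases, and the returned mean squared error is the fitness that GWO minimizes. The function name mlpCost, the one-hot target layout, and the sigmoid/MSE choices are illustrative assumptions.

function err = mlpCost(x, X, T, H)
% mlpCost  Fitness of one GWO search agent interpreted as an MLP (sketch only).
%   x : vector of all weights and biases (length d*H + H + H*c + c), assumed layout
%   X : nSamples-by-d input matrix (d = 4 for Iris)
%   T : nSamples-by-c one-hot target matrix (c = 3 for Iris)
%   H : number of hidden neurons
x = x(:).';                                   % accept row or column agents
[n, d] = size(X);
c = size(T, 2);
% Unpack input-to-hidden weights/biases, then hidden-to-output weights/biases
W1 = reshape(x(1:d*H), H, d);
b1 = x(d*H+1 : d*H+H);
W2 = reshape(x(d*H+H+1 : d*H+H+H*c), c, H);
b2 = x(end-c+1 : end);
% Forward pass: sigmoid hidden layer, linear output layer
Hout = 1 ./ (1 + exp(-(X*W1' + repmat(b1, n, 1))));
Y    = Hout*W2' + repmat(b2, n, 1);
% Mean squared error over all outputs: the value GWO tries to minimize
err = mean(sum((Y - T).^2, 2));
end

A GWO implementation would then minimize an objective handle such as fobj = @(x) mlpCost(x, X, T, H) over a search space of dimension d*H + H + H*c + c; the exact calling convention of the GWO routine shipped with this submission may differ.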

I have a number of relevant courses in this area. You can enrol via the following links with a 95% discount:

*******************************************************************************************************************************************
A course on “Optimization Problems and Algorithms: how to understand, formulate, and solve optimization problems”:
https://www.udemy.com/optimisation/?couponCode=MATHWORKSREF

A course on “Introduction to Genetic Algorithms: Theory and Applications”:
https://www.udemy.com/geneticalgorithm/?couponCode=MATHWORKSREF
*******************************************************************************************************************************************

Cite As

Seyedali Mirjalili (2024). Grey Wolf Optimizer for Training Multi-Layer Perceptrons (https://www.mathworks.com/matlabcentral/fileexchange/49772-grey-wolf-optimizer-for-training-multi-layer-perceptrons), MATLAB Central File Exchange. Retrieved .

MATLAB Release Compatibility
Created with R2011b
Compatible with any release
Platform Compatibility
Windows macOS Linux

Version History

1.2
Fixed a typo in the description.
Links added:
https://www.udemy.com/optimisation/?couponCode=MATHWORKSREF
https://www.udemy.com/geneticalgorithm/?couponCode=MATHWORKSREF

1.1.0.0
The paper has been added to the description.

1.0.0.0