Setting up a 3 layered back-propagation neural network

stayfrosty on 28 June 2016 (edited 6 July 2016)
I'm trying to set up a neural network with the following requirements -
  • three-layered;
  • feed-forward;
  • classical tan-sigmoid and linear functions in the hidden and output layers, respectively;
  • 5 neurons in the hidden layer;
  • trained with the Levenberg-Marquardt back-propagation algorithm
  • converges in 5 iterations
Basically, the neural network is to be trained by giving an RGB map input (3 values) and target output skin parameters (3 values). I've tried using the 'nntool' MATLAB wizard, but I'm unsure whether 'nftool' is the one I'm looking for: it says the network is two-layered, and there's no option to make it converge in 5 iterations.
I'm new to setting up neural networks as that really isn't the main focus of my project. My question is - is the 'nftool' wizard the thing I'm after and are there settings in it that meet my listed neural network requirements? If not, is there some sort of coding template I can alter to create and train my neural network?
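For reference, the listed requirements map fairly directly onto the toolbox's command-line workflow. This is a minimal sketch, not a definitive setup; the variables `x` and `t` are placeholders standing in for the RGB inputs and target skin parameters, and the random data here exists only so the snippet runs:

```matlab
% Placeholder data: 3 x N RGB inputs and 3 x N skin-parameter targets.
x = rand(3, 100);
t = rand(3, 100);

net = fitnet(5, 'trainlm');            % 5 hidden neurons, Levenberg-Marquardt
net.layers{1}.transferFcn = 'tansig';  % tan-sigmoid hidden layer (fitnet default)
net.layers{2}.transferFcn = 'purelin'; % linear output layer (fitnet default)
net.trainParam.epochs = 5;             % quit after 5 iterations

[net, tr] = train(net, x, t);
y = net(x);                            % network outputs for the training inputs
```

Note that `epochs = 5` only caps the iteration count; it does not guarantee convergence, as discussed in the comments below.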
  2 Comments
José-Luis on 28 June 2016 (edited 28 June 2016)
How could you make it converge in five iterations? Convergence is not something you can impose. On the other hand, you could try and make it quit after five iterations.
stayfrosty on 28 June 2016
Well, I'm trying to reproduce the results of a study, and it did say the network "converges in 5 iterations". All I'm attempting to do is reproduce the results according to the details given. Apart from that detail, is 'nftool' the MATLAB tool I should be using?


Answers (1)

Greg Heath on 2 July 2016
Only hidden and output nodes are considered to be in neuron layers, because they are associated with non-identity transfer functions. Input nodes are only considered to be fan-in units, not neurons. Therefore, although there are three layers of nodes, you have a two-layer network because there are only two layers of neurons.
You cannot duplicate designs without knowing the random number seed from which random initial weights and random trn/val/tst data division are obtained.
If you want to use the GUI, the fitting tool nftool is appropriate.
However, I prefer the command line approach similar to the examples in the HELP and DOC documentation and the zillions of examples I have posted in both the NEWSGROUP & ANSWERS.
help fitnet
doc fitnet
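To make a run repeatable in the sense described above, the random seed and the trn/val/tst division can be fixed before training. A minimal sketch using documented toolbox settings (the seed value 42 and the split ratios are arbitrary choices, and `x`/`t` stand in for your own inputs and targets):

```matlab
rng(42);                           % fix the random number seed (42 is arbitrary)
net = fitnet(5, 'trainlm');        % defaults: tansig hidden, purelin output
net.divideFcn = 'dividerand';      % random trn/val/tst division (the default)
net.divideParam.trainRatio = 0.70; % example split ratios
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
[net, tr] = train(net, x, t);      % x, t: your inputs and targets
```

With the seed fixed, both the initial weights and the data division are reproduced on every run, which is what repeating a published design requires.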
Hope this helps.
Thank you for formally accepting my answer
Greg
  1 Comment
stayfrosty on 6 July 2016 (edited 6 July 2016)
Thank you for your reply. Could you point me to examples which you think are relevant to my dilemma? Here is a video link to the study whose neural network I was initially trying to replicate.
When I first undertook this as a project, I didn't expect to have to mess around with neural networks. Unfortunately, to reproduce the findings of this study, it looks like a must.

