fminunc in an endless loop

Hi all,
I would really appreciate some help. I have written a neural network model and tried to keep it as flexible as possible. The model works perfectly (tested against test cases from several online courses) for binary classification, both with and without the optimization function fminunc. It also works (and is tested) for multi-class classification as long as I do NOT use fminunc. However, when I use fminunc, the cost I am trying to minimize stays stuck at its initial value and the whole program runs in an endless loop. I have also tried feeding fminunc the flattened gradient in addition to the parameters (the sought-after output), but that does not help either: the cost stays at the initial value.
Any ideas what I am doing wrong?
Here is my code:
%
% Reshape data so that examples are in columns, features in rows
X = X';
y = double( one_hot(y'));
%
% Set up layer dimensions
layer_dims = [400,25,10];
activ_hidden = 'sigmoid'; % Sigmoid used in test case
activ_out = 'sigmoid'; % Sigmoid used in test case
maxIter = 50; % 50 used in test case
lambda = 0; % No L2-regularization in test case
keep_prob = 1; % No dropout-regularization in test case
learning_rate = 1; % Required model argument but not needed in fminunc, hence set to 1
%
% Set up initial parameters with seed from test case
parameters = rand_u_init(layer_dims);
%
% Reshape parameters
parameters_flat = flatten_params(parameters, layer_dims);
%
% Set up options for fminunc
options = optimset('MaxIter', maxIter);
%
% Create short hand for the cost function to be minimized
cost_function = @(p) L_layer_multi_optimized_model_for_testing(...
p, ...
X,...
y, ...
layer_dims, ...
activ_hidden, ...
activ_out, ...
lambda, ...
keep_prob, ...
learning_rate);
%
% Run optimization algorithm fminunc
[parameters_flat, cost] = ...
fminunc(cost_function, parameters_flat, options);
%
% Reshape parameters
parameters = reshape_params(parameters_flat, layer_dims);
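A common cause of a cost that never moves under fminunc is that the analytic gradient is not actually wired in: unless the objective returns the gradient as a second output AND the 'GradObj' option is set, fminunc falls back on finite differences, which can stall on a flat or noisy cost surface. Below is a minimal sketch of that wiring; `nn_cost` is a hypothetical stand-in for the poster's cost function, not their actual code:

```matlab
% Sketch only: supply the analytic gradient to fminunc.
% The objective must return BOTH cost and a gradient flattened
% the same way as p, i.e. its signature looks like:
%   function [J, grad] = nn_cost(p, X, y, layer_dims)
%
options = optimset('MaxIter', 50, 'GradObj', 'on');  % 'GradObj' tells fminunc a gradient is returned
cost_function = @(p) nn_cost(p, X, y, layer_dims);   % must return [J, grad]
[p_opt, J_opt] = fminunc(cost_function, p0, options);
```

It is also worth verifying the returned gradient against a central-difference approximation on a few parameters before handing it to fminunc; a mismatch there produces exactly this kind of stalled optimization.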
As I said, the model runs perfectly well without fminunc, and I cannot find the mistake. Any help is really appreciated.
Thanks a lot in advance, cheers, Wolfgang

Answers (1)

Alan Weiss on 27 Oct 2017

While I do not really understand what you are trying to do, perhaps you need to set fminunc options to use larger finite differences than the defaults.
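For reference, larger finite-difference steps can be requested through optimset's DiffMinChange/DiffMaxChange options; the values below are illustrative, not recommendations:

```matlab
% Example only: widen the finite-difference perturbations fminunc uses
options = optimset('MaxIter', 50, ...
                   'DiffMinChange', 1e-4, ...  % minimum change in variables for gradient estimation
                   'DiffMaxChange', 1e-1);     % maximum change in variables for gradient estimation
```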
Alan Weiss
MATLAB mathematical toolbox documentation

1 Comment

Wolfgang Reuter on 28 Oct 2017
Hi Alan,
thanks a lot for your suggestion; unfortunately it did not solve the problem. What I am trying to do is simply set up my own neural network model with a choice of activations, initializations and regularizations, and run it with fminunc for better performance.

