Error: unrecognized parameter 'GradObj' in optimset

I am getting this in my command window:
Please correct your code and resubmit.
which -all optimset
/opt/mlsedu/matlab/R2019a/toolbox/matlab/optimfun/optimset.m
Now can anyone help me solve the issue?

8 Comments

Geoff Hayes on 5 Jul 2019
Kavidha - you have posted (roughly) the same question a half dozen times over the last 2-3 days. Why not just further the conversation with your earliest post? Why don't you follow up to any of the comments on those questions? Can you explain where you are submitting your code?
Kavidha Krishnamoorthi on 5 Jul 2019
I am submitting my assignments given by Coursera for machine learning. I have completed all the assignments except one, where I am getting an error.
Geoff Hayes on 5 Jul 2019
What is the line of code that is generating the error? Please copy and paste (to this question) the line or lines of code. Also, please copy and paste (to this question) the complete error message (that which includes the 'gradObj' error). And are you calling
which -all optimset
or is that being returned in the error. Which version of MATLAB are you using (on your computer) and which version of MATLAB is Coursera using?
Kavidha Krishnamoorthi on 5 Jul 2019
I don't know which version of MATLAB Coursera is using. I registered for MATLAB programming, so I got a licence for 119 days to practice online, and I used it to submit my machine learning assignments. I even asked the Coursera mentors for help regarding the error; they told me to post my questions on the MATLAB discussion forum.
Kavidha Krishnamoorthi on 5 Jul 2019
Edited: Geoff Hayes on 5 Jul 2019
For Programming Assignment: Regularized Linear Regression and Bias/Variance, I am getting errors for two programs.
learningCurve.m
function [error_train, error_val] = ...
learningCurve(X, y, Xval, yval, lambda)
%LEARNINGCURVE Generates the train and cross validation set errors needed
%to plot a learning curve
% [error_train, error_val] = ...
% LEARNINGCURVE(X, y, Xval, yval, lambda) returns the train and
% cross validation set errors for a learning curve. In particular,
% it returns two vectors of the same length - error_train and
% error_val. Then, error_train(i) contains the training error for
% i examples (and similarly for error_val(i)).
%
% In this function, you will compute the train and test errors for
% dataset sizes from 1 up to m. In practice, when working with larger
% datasets, you might want to do this in larger intervals.
%
% Number of training examples
m = size(X, 1);
% You need to return these values correctly
error_train = zeros(m, 1);
error_val = zeros(m, 1);
% ====================== YOUR CODE HERE ======================
% Instructions: Fill in this function to return training errors in
% error_train and the cross validation errors in error_val.
% i.e., error_train(i) and
% error_val(i) should give you the errors
% obtained after training on i examples.
%
% Note: You should evaluate the training error on the first i training
% examples (i.e., X(1:i, :) and y(1:i)).
%
% For the cross-validation error, you should instead evaluate on
% the _entire_ cross validation set (Xval and yval).
%
% Note: If you are using your cost function (linearRegCostFunction)
% to compute the training and cross validation error, you should
% call the function with the lambda argument set to 0.
% Do note that you will still need to use lambda when running
% the training to obtain the theta parameters.
%
% Hint: You can loop over the examples with the following:
%
% for i = 1:m
% % Compute train/cross validation errors using training examples
% % X(1:i, :) and y(1:i), storing the result in
% % error_train(i) and error_val(i)
% ....
%
% end
%
% ---------------------- Sample Solution ----------------------
% linearRegCostFunction(X, y, theta, lambda)
for i = 1:m
    X_train = X(1:i, :);
    y_train = y(1:i);
    theta = trainLinearReg(X_train, y_train, lambda);
    error_train(i) = linearRegCostFunction(X_train, y_train, theta, 0);
    error_val(i) = linearRegCostFunction(Xval, yval, theta, 0);
end
% -------------------------------------------------------------
%
% =========================================================================
end
When I run this program I am getting:
Not enough input arguments.
Error in learningCurve (line 19)
m = size(X, 1);
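For context, "Not enough input arguments" usually just means the function file was run on its own (e.g. via the Run button), so X was never passed in; learningCurve has to be called with all five inputs from a script. A minimal sketch, assuming the variables are loaded by the assignment's driver script from the course data file (names assumed from the exercise):

```matlab
% Hypothetical call from the assignment's driver script; X, y, Xval,
% yval are assumed to be loaded from the course data (ex5data1.mat).
lambda = 0;
[error_train, error_val] = learningCurve([ones(size(X, 1), 1) X], y, ...
    [ones(size(Xval, 1), 1) Xval], yval, lambda);
```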
While submitting I am getting:
In submit (line 35)
Warning: File: submitWithConfiguration.m Line: 66 Column: 3
"submissionUrl" is used as a function or command and then as a variable name.
Using "submissionUrl" as both a function and a variable name in the same scope will error in future release.
> In submit (line 35)
Warning: Function Warning: Name is nonexistent or not a directory: /MATLAB Drive/./lib/jsonlab
> In path (line 109)
In addpath (line 86)
In addpath (line 47)
In submitWithConfiguration (line 2)
In submit (line 35)
== Submitting solutions | Regularized Linear Regression and Bias/Variance...
Use token from last successful submission (kavidhakrishan2323@gmail.com)? (Y/n):
Y
!! Submission failed: Error using optimset (line 255)
Unrecognized parameter name 'GradObj'. Please see the optimset reference page in the documentation for a list of acceptable option parameters. Link to reference page.
Function: optimset
FileName: /opt/mlsedu/matlab/R2019a/toolbox/matlab/optimfun/optimset.m
LineNumber: 255
Please correct your code and resubmit.
which -all optimset
/opt/mlsedu/matlab/R2019a/toolbox/matlab/optimfun/optimset.m
ver
------------------------------------------------------------------------------------------------------
MATLAB Version: 9.6.0.1131991 (R2019a) Update 3
MATLAB License Number:
Operating System: Linux 4.14.121-0414121-generic #201905211331 SMP Tue May 21 17:34:21 UTC 2019 x86_64
Java Version: Java 1.8.0_181-b13 with Oracle Corporation Java HotSpot(TM) 64-Bit Server VM mixed mode
------------------------------------------------------------------------------------------------------
MATLAB Version 9.6 (R2019a)
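One likely cause, assuming the grading environment has no Optimization Toolbox: the optimset.m shown by which -all is the base-MATLAB version under toolbox/matlab/optimfun, and in R2019a that version does not accept 'GradObj', which is exactly the error reported. Since the course-supplied fmincg only reads plain struct fields from its options argument, one possible workaround in the code that calls fmincg is to build the struct directly instead of calling optimset:

```matlab
% Sketch of a possible workaround (not verified against the grader):
% replace the optimset call with a plain struct, which fmincg can read
% without needing the Optimization Toolbox version of optimset.
options = struct('GradObj', 'on', 'MaxIter', 50);
```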
Geoff Hayes on 5 Jul 2019
How are you calling learningCurve? Are you passing in all the required parameters to this function? Where is the line of code that uses GradObj? Have you tried not using this parameter?
And are you calling ver and which -all optimset or is that a result of the project submission?
Kavidha Krishnamoorthi on 5 Jul 2019
For each assignment we have to use a different folder, which contains its own lib folder and submit code.
'GradObj' was used in the third assignment; it has a different lib folder and other files. The following is the code from the 3rd assignment.
For the Multi-class Classification and Neural Networks assignment we have to use 'GradObj'.
oneVsAll.m
function [all_theta] = oneVsAll(X, y, num_labels, lambda)
%ONEVSALL trains multiple logistic regression classifiers and returns all
%the classifiers in a matrix all_theta, where the i-th row of all_theta
%corresponds to the classifier for label i
% [all_theta] = ONEVSALL(X, y, num_labels, lambda) trains num_labels
% logistic regression classifiers and returns each of these classifiers
% in a matrix all_theta, where the i-th row of all_theta corresponds
% to the classifier for label i
% Some useful variables
m = size(X, 1);
n = size(X, 2);
% You need to return the following variables correctly
all_theta = zeros(num_labels, n + 1);
% Add ones to the X data matrix
X = [ones(m, 1) X];
% ====================== YOUR CODE HERE ======================
% Instructions: You should complete the following code to train num_labels
% logistic regression classifiers with regularization
% parameter lambda.
%
% Hint: theta(:) will return a column vector.
%
% Hint: You can use y == c to obtain a vector of 1's and 0's that tell you
% whether the ground truth is true/false for this class.
%
% Note: For this assignment, we recommend using fmincg to optimize the cost
% function. It is okay to use a for-loop (for c = 1:num_labels) to
% loop over the different classes.
%
% fmincg works similarly to fminunc, but is more efficient when we
% are dealing with large number of parameters.
%
% Example Code for fmincg:
%
% % Set Initial theta
% initial_theta = zeros(n + 1, 1);
%
% % Set options for fminunc
% options = optimset('GradObj', 'on', 'MaxIter', 50);
%
% % Run fmincg to obtain the optimal theta
% % This function will return theta and the cost
% [theta] = ...
% fmincg (@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
% initial_theta, options);
%
for c = 1:num_labels
    initial_theta = zeros(n + 1, 1);
    options = optimset('GradObj', 'on', 'MaxIter', 50);
    [theta] = fmincg(@(t)(lrCostFunction(t, X, (y == c), lambda)), initial_theta, options);
    all_theta(c, :) = theta';
end
% =========================================================================
end
For my 5th assignment we have a different lib folder and a different set of files. There is no code in the 5th assignment which uses optimset.
As far as I understand, each assignment has its own set of lib folders and files to execute the program and to submit.
So I don't understand why I am getting an error in the 5th assignment while executing files here; I have not used 'GradObj' in any of its code.
I don't know how online MATLAB works. It has a storage space for saving online files. Based on my submissions for each assignment, it must have saved all the executed files in its memory; otherwise it would not be able to retain scores for each assignment.
Maybe that's the reason I am getting an error, because with a separate lib folder and files for each assignment I should not get this error.
Kavidha Krishnamoorthi on 5 Jul 2019
I think it's a technical error, and maybe with a proper toolbox licence I might not be getting this error.


Answers (0)
