# GradientDescent algorithm returning error: left and right side different number of elements

2 views (last 30 days)
Jeani de Jongh on 11 Apr 2019
Commented: Brendan Hamm on 11 Apr 2019
I am currently working on an online assignment implementing gradient descent. I keep getting an error saying that the line that saves the cost J (which was given to us) cannot be run because the two sides have a different number of elements. I have not edited that line, so the problem must be somewhere in my code; I just don't know what's wrong.
```matlab
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
% Takes num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters
    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %
    hypothesis = X * theta;
    error = (hypothesis - y).^2;
    theta = theta - alpha * 1/2*m * sum(error * X(:,2));
    % ===========================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);
end
```
I keep getting the error that in the line

```matlab
J_history(iter) = computeCost(X, y, theta);
```

the two sides have a different number of elements.
EDIT: Here's computeCost:

```matlab
function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.
hypothesis = X * theta;
error = (hypothesis - y).^2;
J = 1/(2*m) * sum(error);
% =========================================================================
end
```
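As a quick sanity check, computeCost can be exercised on a tiny hand-checkable dataset (the numbers below are made up for illustration, not from the assignment data):

```matlab
% Tiny example: 3 training examples with a leading column of ones.
X = [1 1; 1 2; 1 3];          % 3x2 design matrix
y = [1; 2; 3];                % 3x1 column of targets
theta = [0; 1];               % with this theta, X*theta equals y exactly
J = computeCost(X, y, theta)  % a perfect fit should give J = 0
```

If J comes back as anything other than a single scalar, the sizes of the intermediate arrays are worth inspecting.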


### Answers (1)

Brendan Hamm on 11 Apr 2019
I would put a breakpoint on the line:

```matlab
error = ((hypothesis - y).^2);
```

I would assume that the sizes of hypothesis and y are different (one a row vector, the other a column), leading to implicit expansion. That is, a row vector minus a column vector (or vice versa) returns a matrix, and then sum(matrix) returns a row vector. If this is the case, you should only need to transpose one of the variables:

```matlab
error = ((hypothesis.' - y).^2);
```
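The implicit-expansion behaviour described above can be reproduced in a few lines (toy values, purely for illustration):

```matlab
% Row vector minus column vector triggers implicit expansion:
h = [1 2 3];        % 1x3 row vector
y = [1; 2; 3];      % 3x1 column vector
E = (h - y).^2      % result is a 3x3 MATRIX, not a 3x1 vector
s = sum(E)          % sum over columns -> 1x3 ROW vector,
                    % which cannot be stored in the scalar J_history(iter)
```

This is exactly the shape of failure that produces a "different number of elements" assignment error downstream.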
##### 3 Comments
Brendan Hamm on 11 Apr 2019
If these were indeed the sizes of the arrays, then you would error on this line:

```matlab
theta = theta - alpha * 1/2*m * sum(error * X(:,2));
```

That is, error is 97x1 and X(:,2) is 97x1, so the matrix multiplication (error * X(:,2)) is not defined. So use breakpoints (possibly conditional breakpoints) to determine the actual sizes of these arrays at the time of the error, and test things out at the command line in debug mode.

The error message should tell you exactly the line on which the error occurs, possibly in multiple files, but the one to focus on is the line in your own code (in gradientDescent, computeCost, or both). Put a breakpoint on that line and inspect your workspace; this should give clues as to where your assumptions differ from what is actually coded.
Brendan Hamm on 11 Apr 2019
Actually, looking at this in more detail, you are not calculating the gradient of the cost function correctly. I would suggest looking at the formula in more detail and making sure you understand it. I hesitate to give the answer, as this would spoil your learning experience, but hopefully this points you in the right direction.

Specifically, this is the part that is not right:

```matlab
1/2*m * sum(error * X(:,2))
```

(Note also that by MATLAB operator precedence, 1/2*m evaluates to (1/2)*m, i.e. m/2, not 1/(2*m).)
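For reference only (the comment above deliberately stops short of a full solution), the standard vectorized batch update for linear regression is usually written as below. The key differences from the poster's line are that the scaling factor is 1/m, the gradient is built from the raw residuals (h - y) rather than the squared errors, and X.' * (h - y) updates all parameters at once:

```matlab
% Generic vectorized gradient step (textbook formula, not the poster's code).
% Assumes X is m x n, y is m x 1, theta is n x 1.
h     = X * theta;                 % m x 1 vector of predictions
grad  = (1/m) * (X.' * (h - y));   % n x 1 gradient of the cost
theta = theta - alpha * grad;      % simultaneous update of all parameters
```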

