function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the
%               partial derivatives of the cost w.r.t. each parameter
%               in theta.
%
% Note: grad should have the same dimensions as theta.
%
% Refer to the vectorized formulas in Week 3 | Logistic Regression Model |
% Simplified Cost Function and Gradient Descent.
% Hypothesis: h = g(X * theta), where g is the sigmoid function.
% Note: the original used the Octave-only `.-` operator, which errors
% in MATLAB; plain `-` is correct in both.
h = sigmoid(X * theta);
J = 1/m * (-y' * log(h) - (1 - y)' * log(1 - h));
% Compute the gradient. Two equivalent approaches:
%
% 1. vectorized method
% grad = 1/m * X' * (sigmoid(X*theta) - y);
%
% 2. sum method: (sigmoid(X*theta) - y) .* X broadcasts an m-by-1
%    column across the columns of X, and sum() over the rows returns a
%    1-by-n row vector, so transpose the result to match the
%    dimensions of theta
grad = (1/m * sum((sigmoid(X*theta) - y) .* X))';
% =============================================================

end
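% ---------------------------------------------------------------
% Example usage (a minimal sketch, not part of the graded submission;
% assumes sigmoid.m is on the path, i.e. sigmoid(z) = 1 ./ (1 + exp(-z)),
% and that X already contains the intercept column of ones):
%
%   X = [ones(3, 1), (1:3)'];   % 3 training examples, 2 parameters
%   y = [0; 0; 1];
%   theta = zeros(2, 1);
%   [J, grad] = costFunction(theta, X, y);
%
% With theta = 0 the hypothesis is 0.5 for every example, so
% J = -log(0.5) = log(2), roughly 0.6931, and the gradient reduces to
% X' * (0.5 - y) / m.
% ---------------------------------------------------------------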