costFunction.m

function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));
% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta.
%
% Note: grad should have the same dimensions as theta
%
% Refer to the vectorized formulas in Week 3 | Logistic Regression Model |
% Simplified Cost Function and Gradient Descent.
%
% Hypothesis: h = g(X * theta), where g is the sigmoid function
J = (1/m) * (-y' * log(sigmoid(X*theta)) - (1 - y)' * log(1 - sigmoid(X*theta)));
% Compute the gradient. Two equivalent formulations:
%
% 1. vector method
% grad = (1/m) * X' * (sigmoid(X*theta) - y);
%
% 2. sum method (summing over rows yields a 1 x n row vector, so
%    transpose to match the dimensions of theta)
grad = (1/m) * sum((sigmoid(X*theta) - y) .* X)';
% =============================================================

end
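As a quick sanity check of the vectorized formulas, note that at `theta = 0` the hypothesis is 0.5 for every example, so the cost is `log(2)` regardless of the data. The sketch below uses made-up toy data (not part of the assignment) and an inline logistic function standing in for the course's `sigmoid.m`; it also confirms that the vector and sum methods for the gradient agree once the sum result is transposed.

```octave
% Toy data: m = 4 examples, n = 3 parameters (intercept column included).
X = [ones(4,1), [1; 2; 3; 4], [2; 3; 4; 5]];
y = [0; 0; 1; 1];
theta = zeros(3, 1);
m = length(y);

g = @(z) 1 ./ (1 + exp(-z));   % standard logistic function (sigmoid.m equivalent)
h = g(X * theta);              % h = 0.5 for every example when theta = 0

% Vectorized cost
J = (1/m) * (-y' * log(h) - (1 - y)' * log(1 - h));

% Gradient, both ways
grad     = (1/m) * (X' * (h - y));            % vector method: n x 1
grad_sum = (1/m) * sum((h - y) .* X)';        % sum method, transposed to n x 1
```

Running this gives `J = log(2) ≈ 0.6931`, and `grad` and `grad_sum` are identical column vectors with the same dimensions as `theta`.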