lrCostFunction.m

function [J, grad] = lrCostFunction(theta, X, y, lambda)
%LRCOSTFUNCTION Compute cost and gradient for logistic regression with
%regularization
%   J = LRCOSTFUNCTION(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. to the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta.
%
% Hint: The computation of the cost function and gradients can be
%       efficiently vectorized. For example, consider the computation
%
%           sigmoid(X * theta)
%
%       Each row of the resulting matrix will contain the value of the
%       prediction for that example. You can make use of this to vectorize
%       the cost function and gradient computations.
%
% Hint: When computing the gradient of the regularized cost function,
%       there're many possible vectorized solutions, but one solution
%       looks like:
%           grad = (unregularized gradient for logistic regression)
%           temp = theta;
%           temp(1) = 0;   % because we don't add anything for j = 0
%           grad = grad + YOUR_CODE_HERE (using the temp variable)

% Hypothesis: predicted probability for each training example
h = sigmoid(X * theta);

% Regularization term; theta(1), the bias term, is not regularized
reg = lambda * (sum(theta .^ 2) - theta(1)^2) / (2 * m);

% Regularized cross-entropy cost
J = mean(-y .* log(h) - (1 - y) .* log(1 - h)) + reg;

% Gradient, regularized for every parameter except theta(1)
grad = mean((h - y) .* X)' + (lambda / m) * theta;
grad(1) = grad(1) - (lambda / m) * theta(1);

% =============================================================

grad = grad(:);

end
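For readers who want to check the vectorized cost and gradient outside Octave, here is a minimal NumPy sketch of the same computation. The function name `lr_cost_function` and the inline sigmoid are my own naming for illustration, not part of the assignment code; the formulas mirror the Octave implementation above (bias term `theta[0]` excluded from regularization).

```python
import numpy as np

def lr_cost_function(theta, X, y, lam):
    """Regularized logistic-regression cost and gradient (NumPy sketch).

    theta : (n,) parameter vector; theta[0] is the unregularized bias term
    X     : (m, n) design matrix whose first column is all ones
    y     : (m,) vector of 0/1 labels
    lam   : regularization strength lambda
    """
    m = y.size
    h = 1.0 / (1.0 + np.exp(-X @ theta))            # sigmoid(X * theta)
    reg = lam * np.sum(theta[1:] ** 2) / (2 * m)    # skip the bias term
    J = np.mean(-y * np.log(h) - (1 - y) * np.log(1 - h)) + reg
    grad = X.T @ (h - y) / m                        # unregularized gradient
    grad[1:] += (lam / m) * theta[1:]               # no penalty on grad[0]
    return J, grad
```

With `theta = 0` every prediction is 0.5, so the cost reduces to `log(2)` regardless of the labels, which is a quick sanity check for this kind of implementation.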