learningCurve.m

function [error_train, error_val] = ...
    learningCurve(X, y, Xval, yval, lambda)
%LEARNINGCURVE Generates the train and cross validation set errors needed
%to plot a learning curve
%   [error_train, error_val] = ...
%       LEARNINGCURVE(X, y, Xval, yval, lambda) returns the train and
%       cross validation set errors for a learning curve. In particular,
%       it returns two vectors of the same length - error_train and
%       error_val. Then, error_train(i) contains the training error for
%       i examples (and similarly for error_val(i)).
%
%   In this function, you will compute the train and cross validation
%   errors for dataset sizes from 1 up to m. In practice, when working
%   with larger datasets, you might want to do this in larger intervals.
%
% Number of training examples
m = size(X, 1);

% You need to return these values correctly
error_train = zeros(m, 1);
error_val   = zeros(m, 1);
% ====================== YOUR CODE HERE ======================
% Instructions: Fill in this function to return training errors in
%               error_train and the cross validation errors in error_val.
%               i.e., error_train(i) and error_val(i) should give you
%               the errors obtained after training on i examples.
%
% Note: You should evaluate the training error on the first i training
%       examples (i.e., X(1:i, :) and y(1:i)).
%
%       For the cross-validation error, you should instead evaluate on
%       the _entire_ cross validation set (Xval and yval).
%
% Note: If you are using your cost function (linearRegCostFunction)
%       to compute the training and cross validation error, you should
%       call the function with the lambda argument set to 0.
%       Do note that you will still need to use lambda when running
%       the training to obtain the theta parameters.
%
% Hint: You can loop over the examples with the following:
%
%       for i = 1:m
%           % Compute train/cross validation errors using training examples
%           % X(1:i, :) and y(1:i), storing the result in
%           % error_train(i) and error_val(i)
%           ....
%
%       end
%
% ---------------------- Sample Solution ----------------------
for i = 1:m
    % Train on the first i examples; lambda is used here to obtain theta
    X_train = X(1:i, :);
    y_train = y(1:i);
    theta = trainLinearReg(X_train, y_train, lambda);

    % Evaluate the errors with lambda set to 0 so the regularization term
    % is not included in the reported cost
    error_train(i) = linearRegCostFunction(X_train, y_train, theta, 0);
    error_val(i)   = linearRegCostFunction(Xval, yval, theta, 0);
end
% -------------------------------------------------------------

% =========================================================================

end
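
% ---------------------- Usage (illustrative) ----------------------
% A minimal sketch of how this function might be called from the exercise
% script, assuming trainLinearReg and linearRegCostFunction are on the path
% and that X and Xval do not yet include the bias column:
%
%   lambda = 0;
%   [error_train, error_val] = ...
%       learningCurve([ones(m, 1) X], y, ...
%                     [ones(size(Xval, 1), 1) Xval], yval, lambda);
%
%   plot(1:m, error_train, 1:m, error_val);
%   legend('Train', 'Cross Validation');
%   xlabel('Number of training examples');
%   ylabel('Error');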