CS 451 Quiz 4
Linear Regression with Multiple Variables
Notation and conventions
Note: x_k denotes x with subscript k; x^k denotes x with superscript k (or exponentiation). Recall that we use 'm' for the number of training examples and 'n' for the number of features.
Check all that are true
Vectorization
Check all that are true
Recall that both theta and x are column vectors, and that x' denotes the transpose of x.
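For reference, a minimal NumPy sketch of the vectorized hypothesis h(x) = theta' * x, computed for all m training examples at once (all variable names here are illustrative, not part of the quiz):

import numpy as np

m, n = 5, 3                      # m training examples, n features
X = np.random.rand(m, n + 1)     # design matrix, one row per example
X[:, 0] = 1.0                    # bias feature x_0 = 1
theta = np.random.rand(n + 1)    # parameter (column) vector

# Unvectorized: loop over the features of each example
h_loop = np.array([sum(theta[j] * X[i, j] for j in range(n + 1))
                   for i in range(m)])

# Vectorized: one matrix-vector product computes theta' * x for every row
h_vec = X @ theta

assert np.allclose(h_loop, h_vec)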
Gradient descent
In the above multivariate gradient descent algorithm, the loop "for j := 0...n" should be executed
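The algorithm listing the question refers to is not reproduced in this transcript; as a reminder, the standard multivariate update is theta_j := theta_j - (alpha/m) * sum_i (h(x^(i)) - y^(i)) * x_j^(i). A minimal NumPy sketch (the function name and defaults are assumptions), in which the vectorized update performs the "for j := 0...n" loop simultaneously for all parameters:

import numpy as np

def gradient_descent(X, y, alpha=0.01, num_iters=1500):
    # X: (m, n+1) design matrix with a leading column of ones
    # y: (m,) vector of targets
    m = len(y)
    theta = np.zeros(X.shape[1])
    for _ in range(num_iters):
        error = X @ theta - y                        # h(x^(i)) - y^(i) for all i
        theta = theta - (alpha / m) * (X.T @ error)  # updates all theta_j simultaneously
    return theta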
Feature scaling
Feature scaling guarantees convergence, but may slow down the convergence rate
Which are valid ways of performing feature scaling? Check all that apply.
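Two commonly used ways of scaling a feature are mean normalization, (x - mu)/s, and rescaling by the range (min-max scaling). A minimal NumPy sketch of both (function names are illustrative):

import numpy as np

def mean_normalize(x):
    # Mean normalization: subtract the mean, divide by the standard deviation
    return (x - x.mean()) / x.std()

def min_max_scale(x):
    # Rescale into [0, 1]: (x - min) / (max - min)
    return (x - x.min()) / (x.max() - x.min())

sizes = np.array([2104.0, 1416.0, 1534.0, 852.0])  # hypothetical house sizes
print(mean_normalize(sizes))
print(min_max_scale(sizes))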
Learning rate
If the learning rate alpha is too small, the cost may increase in some iterations
Plotting the cost vs. the number of iterations is a good way of checking whether gradient descent converges
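A minimal sketch of that diagnostic (the toy data and all names are assumptions): record J(theta) after every gradient descent step and plot it; with a well-chosen alpha the curve decreases on every iteration.

import numpy as np
import matplotlib.pyplot as plt

def cost(X, y, theta):
    # Squared-error cost J(theta) = 1/(2m) * sum_i (h(x^(i)) - y^(i))^2
    m = len(y)
    return ((X @ theta - y) ** 2).sum() / (2 * m)

X = np.c_[np.ones(4), np.array([1.0, 2.0, 3.0, 4.0])]  # toy design matrix
y = np.array([2.0, 4.0, 6.0, 8.0])                     # toy targets

alpha, num_iters = 0.05, 200
theta = np.zeros(X.shape[1])
history = []
for _ in range(num_iters):
    theta -= (alpha / len(y)) * (X.T @ (X @ theta - y))
    history.append(cost(X, y, theta))

plt.plot(history)          # should be decreasing on every iteration
plt.xlabel("iteration")
plt.ylabel("J(theta)")
plt.show()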
Polynomial regression
To fit a polynomial to our input data, we can use the standard multivariate linear regression algorithm; we just create additional features, e.g. x2 = x1^2 and x3 = x1^3.
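A minimal NumPy sketch of that feature construction (the data values are made up for illustration):

import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0])    # original single feature

x2 = x1 ** 2                           # new feature x2 = x1^2
x3 = x1 ** 3                           # new feature x3 = x1^3

# Design matrix [1, x1, x2, x3]; plain multivariate linear regression
# on these columns fits a cubic polynomial in x1
X = np.c_[np.ones_like(x1), x1, x2, x3]

Note that after adding powers of a feature, feature scaling becomes especially important, since x1, x1^2, and x1^3 can have very different ranges.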
Normal equation
Instead of using the iterative gradient descent method, the minimum cost can also be found analytically by solving the normal equation
Using the normal equation is the only practical way to minimize J for large n (say, n > 1,000,000), since gradient descent will be too slow.
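For reference, the normal equation is theta = (X'X)^(-1) X'y. A minimal NumPy sketch (the function name is illustrative):

import numpy as np

def normal_equation(X, y):
    # Closed-form minimizer of J(theta): theta = (X'X)^(-1) X' y.
    # pinv is used so the code still works if X'X is non-invertible
    # (e.g., redundant features or m <= n).
    return np.linalg.pinv(X.T @ X) @ X.T @ y

Solving the normal equation costs roughly O(n^3) because of the matrix inverse, which is why it is practical only for moderate n.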