CS 451 Quiz 4
Linear Regression with Multiple Variables
Notation and conventions
Note: x_k denotes x with subscript k; x^k denotes x with superscript k (or exponentiation). Recall that we use 'm' for the number of training examples and 'n' for the number of features.
Check all that are true *
1 point
Required
Vectorization
Check all that are true *
Recall that both theta and x are column vectors, and that x' denotes the transpose of x.
1 point
Required
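For reference, a minimal sketch of the vectorized hypothesis h_theta(x) = theta' * x, written in Python/NumPy (the library choice and variable names are illustrative, not part of the quiz):

```python
import numpy as np

# theta and x are column vectors, as in the question above;
# x_0 = 1 is the bias feature.
theta = np.array([[1.0], [2.0], [3.0]])   # shape (n+1, 1)
x = np.array([[1.0], [0.5], [0.2]])       # shape (n+1, 1)

h = theta.T @ x                           # theta' x, a 1x1 array
print(float(h))                           # 1*1 + 2*0.5 + 3*0.2 = 2.6
```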
Gradient descent
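For reference, a minimal Python/NumPy sketch of multivariate batch gradient descent (the function name and NumPy usage are assumptions for illustration; the course may write the same algorithm in different notation):

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, num_iters=1000):
    """X: (m, n+1) design matrix with a leading column of ones; y: (m,) targets."""
    m, n_plus_1 = X.shape
    theta = np.zeros(n_plus_1)
    for _ in range(num_iters):
        errors = X @ theta - y               # h_theta(x^(i)) - y^(i) for all i
        temp = theta.copy()
        for j in range(n_plus_1):            # the loop "for j := 0...n"
            temp[j] = theta[j] - alpha * (errors @ X[:, j]) / m
        theta = temp                         # all theta_j updated simultaneously
    return theta
```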
In the above multivariate gradient descent algorithm, the loop "for j := 0...n" should be executed *
1 point
Feature scaling
Feature scaling guarantees convergence, but may slow down the convergence rate *
1 point
Which are valid ways of performing feature scaling? Check all that apply. *
1 point
Required
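For reference, a sketch of two common scaling schemes, mean normalization by standard deviation and by range, assuming NumPy and that X_raw holds the unscaled features without the bias column of ones (which is never scaled):

```python
import numpy as np

def scale_std(X_raw):
    """x_j := (x_j - mu_j) / sigma_j  (mean normalization by standard deviation)"""
    return (X_raw - X_raw.mean(axis=0)) / X_raw.std(axis=0)

def scale_range(X_raw):
    """x_j := (x_j - mu_j) / (max_j - min_j)  (mean normalization by range)"""
    return (X_raw - X_raw.mean(axis=0)) / (X_raw.max(axis=0) - X_raw.min(axis=0))
```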
Learning rate
If the learning rate alpha is too small, the cost may increase in some iterations *
1 point
Plotting the cost vs. the number of iterations is a good way of checking whether gradient descent converges *
1 point
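For reference, a sketch of that convergence check, assuming NumPy and matplotlib; the toy data below is made up purely for illustration:

```python
import numpy as np
import matplotlib.pyplot as plt

def cost(X, y, theta):
    # J(theta) = (1 / 2m) * sum_i (h_theta(x^(i)) - y^(i))^2
    m = len(y)
    r = X @ theta - y
    return (r @ r) / (2 * m)

# Hypothetical toy data: y roughly linear in one feature.
rng = np.random.default_rng(0)
m = 50
X = np.hstack([np.ones((m, 1)), rng.random((m, 1))])
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.standard_normal(m)

alpha, num_iters = 0.1, 200
theta = np.zeros(2)
history = []
for _ in range(num_iters):
    theta = theta - alpha * (X.T @ (X @ theta - y)) / m
    history.append(cost(X, y, theta))

plt.plot(history)            # should decrease on every iteration if alpha is small enough
plt.xlabel("iteration")
plt.ylabel("J(theta)")
plt.show()
```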
Polynomial regression
To fit a polynomial to our input data, we can use the standard multivariate linear regression algorithm; we just create additional features, e.g. x2 = x1^2 and x3 = x1^3. *
1 point
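For reference, a sketch of building such polynomial features, assuming NumPy (polynomial_features is an illustrative helper, not a course function); note that feature scaling matters here, since the powers of x1 have very different ranges:

```python
import numpy as np

def polynomial_features(x1, degree=3):
    """Turn a single feature x1 into columns [x1, x1^2, ..., x1^degree],
    e.g. x2 = x1^2 and x3 = x1^3, then fit with ordinary linear regression."""
    x1 = np.asarray(x1, dtype=float).reshape(-1, 1)
    return np.hstack([x1 ** k for k in range(1, degree + 1)])
```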
Normal equation
Instead of using the iterative gradient descent method, the minimum cost can also be found analytically by solving the normal equation *
1 point
Using the normal equation is the only practical way to minimize J for large n (say, n > 1,000,000), since gradient descent will be too slow. *
1 point
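For reference, a sketch of the normal equation theta = (X'X)^(-1) X'y in NumPy; np.linalg.solve is used instead of an explicit matrix inverse for numerical stability:

```python
import numpy as np

def normal_equation(X, y):
    """Closed-form solution theta = (X'X)^(-1) X'y.
    Solving this system costs roughly O(n^3) in the number of features,
    which is why gradient descent is preferred when n is very large."""
    return np.linalg.solve(X.T @ X, X.T @ y)
```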