Lecture 11 Quiz
Linear regression continued
Andrew ID
Which of the following are false about alpha in gradient descent?
1 point
We should use larger values of alpha so we can reach the optimal solution in fewer iterations.
Divergence is likely with larger values of alpha
Alpha can be negative in gradient descent
None of the above
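As a hedged aside (not part of the quiz), the effect of alpha can be sketched on the toy objective f(theta) = theta^2, whose gradient is 2*theta: a small alpha shrinks theta toward the minimum each step, while an alpha large enough that |1 - 2*alpha| > 1 makes the iterates blow up.

```python
import numpy as np

def gradient_descent(alpha, steps=50):
    """Minimize f(theta) = theta^2 (gradient 2*theta), starting from theta = 1."""
    theta = 1.0
    for _ in range(steps):
        theta -= alpha * 2.0 * theta  # standard gradient step
    return theta

small = gradient_descent(alpha=0.1)  # update factor 0.8 -> converges to 0
large = gradient_descent(alpha=1.5)  # update factor -2  -> diverges
```

Here `small` ends up near zero while `large` grows without bound, illustrating why divergence is likely with larger values of alpha.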
Which of the following are true about linear regression?
1 point
Adding more features always leads to a better regression model.
Linear regression aims to pass through the maximum number of data points
We can use linear regression to model nonlinear functions of the features.
None of the above
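As a brief illustration (my own example, not from the lecture) of fitting a nonlinear function with a linear model: adding transformed features such as x^2 to the design matrix keeps the model linear in its parameters while capturing a curve.

```python
import numpy as np

x = np.linspace(-2, 2, 40)
y = 1.0 + 2.0 * x + 3.0 * x**2  # target is nonlinear in x

# Design matrix with nonlinear features of x: [1, x, x^2].
# The model is still linear in theta, so ordinary least squares applies.
X = np.column_stack([np.ones_like(x), x, x**2])
theta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With noiseless data, `theta` recovers the coefficients [1, 2, 3] exactly.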
Which of the following options makes sense?
1 point
Computing closed form solutions can be computationally expensive due to matrix operations.
Gradient descent is preferred over normal equations for smaller datasets
Normal equations are always best to use because they return the optimal value of theta.
All of the above.
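For reference (a sketch, not part of the quiz), the closed-form solution referred to above is the normal equation theta = (X^T X)^(-1) X^T y. Solving it requires factoring the d-by-d matrix X^T X, an O(d^3) operation, which is why it can become computationally expensive when the number of features is large.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))          # 100 examples, 3 features
true_theta = np.array([0.5, -1.0, 2.0])
y = X @ true_theta                     # noiseless targets for illustration

# Normal equation: solve (X^T X) theta = X^T y.
# Factoring X^T X costs O(d^3) in the number of features d.
theta = np.linalg.solve(X.T @ X, X.T @ y)
```

On this noiseless example `theta` matches `true_theta` up to floating-point error.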
This form was created inside of Carnegie Mellon University.