CS 451 Quiz 14
Support Vector Machines
The cost function used in SVMs is similar to the cost function in logistic regression, but instead of using the log function
it is piecewise linear
it is quadratic
it uses the tan function
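For intuition on the first question: a minimal sketch (function names `cost1` and `logistic_cost1` are illustrative, not from the course materials) comparing the piecewise-linear SVM cost for a positive example with its smooth logistic-regression counterpart.

```python
import numpy as np

# Sketch: the SVM cost for a positive example (y = 1) is piecewise linear
# (hinge-like): zero once z >= 1, a straight line for z < 1.
def cost1(z):
    return np.maximum(0.0, 1.0 - z)

# The logistic-regression counterpart -log(sigmoid(z)) is smooth everywhere
# and never exactly zero.
def logistic_cost1(z):
    return np.log1p(np.exp(-z))

z = np.array([-2.0, 0.0, 1.0, 3.0])
print(cost1(z))           # piecewise linear: [3. 1. 0. 0.]
print(logistic_cost1(z))  # smooth, strictly positive
```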
In logistic regression, we have a parameter lambda controlling the amount of regularization. In SVMs, what do we have instead?
A parameter C that acts like lambda
A parameter C that acts like 1 / lambda
SVMs don't require regularization
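Why C acts like 1/lambda: the SVM objective weights the data-fit term by C while logistic regression weights the regularization term by lambda, and scaling an objective by a positive constant does not change its minimizer. A small sketch (the two objective functions below are illustrative simplifications, not the full cost functions):

```python
# Sketch: C * loss + reg  and  loss + (1/C) * reg  have the same minimizer,
# since one is just the other multiplied by the positive constant 1/C.
def svm_objective(theta_sq, loss, C):
    return C * loss + 0.5 * theta_sq

def logreg_objective(theta_sq, loss, lam):
    return loss + lam * 0.5 * theta_sq

C = 4.0
lam = 1.0 / C
# Dividing the SVM objective by C recovers the logistic-style objective:
assert abs(svm_objective(2.0, 3.0, C) / C - logreg_objective(2.0, 3.0, lam)) < 1e-12
```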
While the hypothesis of logistic regression can be interpreted as a probability (a real number between 0 and 1), the hypothesis of an SVM is always either 0 or 1.
True
False
Support vector machines are also known as
large margin classifiers
small margin classifiers
In the context of SVMs, "margin" refers to
the margin of error when classifying training examples
the marginal distribution of the misclassified training examples
the distance from the decision boundary to the nearest training examples
If we have data that is not linearly separable, then SVMs cannot be employed successfully
True
False
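On the non-separable question: the usual fix is to map the data into a higher-dimensional feature space (the idea behind SVM kernels), where a linear boundary can exist. A minimal sketch on XOR-like data, with a hypothetical hand-picked feature map and hyperplane (not a trained SVM):

```python
# XOR-like data: no straight line in (x1, x2) separates the two classes.
points = [((0, 0), -1), ((1, 1), -1), ((0, 1), 1), ((1, 0), 1)]

def feature_map(x1, x2):
    # Illustrative mapping: append the product feature x1 * x2.
    return (x1, x2, x1 * x2)

def decision(x1, x2):
    # In the mapped space, the hyperplane z1 + z2 - 2*z3 - 0.5 = 0
    # separates the classes (coefficients chosen by hand for illustration).
    z1, z2, z3 = feature_map(x1, x2)
    return z1 + z2 - 2 * z3 - 0.5

# Every point lands on the correct side of the mapped hyperplane:
for (x1, x2), y in points:
    assert (decision(x1, x2) > 0) == (y == 1)
```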
For any two n×1 vectors u and v, u' * v = v' * u (where u' denotes the transpose of u)
True
False
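The symmetry of the inner product is easy to check numerically (a quick sketch; the random vectors are arbitrary):

```python
import numpy as np

# Sketch: the inner product of two n x 1 vectors is a scalar, and
# u' * v == v' * u for any such pair.
rng = np.random.default_rng(0)
u = rng.standard_normal(5)
v = rng.standard_normal(5)

assert np.isclose(u @ v, v @ u)
```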
Which spelling mistake does Andrew Ng make in one of the videos?
Intiution
Simplication
Marge largin
To gain intuition about SVM optimization we consider the projections p(i) of the data points onto the parameter vector theta and show that
we are minimizing the length of the vector theta, while enforcing a minimum length of the projections
we are maximizing the length of the projections while keeping the length of the vector theta constant
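The projection argument rests on the identity theta' * x = p * ||theta||, where p is the signed projection of x onto theta: to satisfy theta' * x >= 1 while keeping ||theta|| small, the projections p must be large, which is exactly a large margin. A numeric check with arbitrary illustrative values:

```python
import numpy as np

# Sketch: theta' * x equals p * ||theta||, where p is the signed
# projection of x onto theta.
theta = np.array([3.0, 4.0])   # ||theta|| = 5
x = np.array([2.0, 1.0])

p = (theta @ x) / np.linalg.norm(theta)   # signed projection of x onto theta
assert np.isclose(theta @ x, p * np.linalg.norm(theta))
print(p)  # 2.0
```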
I'd enjoy a free point on this quiz
True
False