Quick Check 20.4 – Data 100, Summer 2020
The complexity of decision trees limits their use in many situations, but modifications can mitigate this. Which of the following are reasonable examples of such modifications?
Restricting the depth
Not splitting nodes containing small fractions of the observations
Increasing the size of the training set
Pruning the nodes of a fully grown tree
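(For reference, not part of the quiz: a minimal scikit-learn sketch of how modifications like those above could be expressed. The parameter values are arbitrary illustrations, and X_train / y_train are assumed to be defined elsewhere.)

    from sklearn.tree import DecisionTreeRegressor

    # Illustrative settings only; the values are placeholders.
    tree = DecisionTreeRegressor(
        max_depth=5,           # restrict the depth of the tree
        min_samples_split=20,  # do not split nodes containing few observations
        ccp_alpha=0.01,        # cost-complexity pruning of a fully grown tree
    )
    # tree.fit(X_train, y_train)  # X_train, y_train assumed to exist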
To train a Random Forest regression model, fit a ___ to each of several ___ and select features at random for consideration at each node.
linear regression model, randomly sampled training sets
regression tree, randomly sampled training sets
linear regression model, bootstrap samples of the training set
regression tree, bootstrap samples of the training set
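(A sketch, outside the quiz itself: fitting a Random Forest regressor with scikit-learn. Each tree is a regression tree fit to a bootstrap sample of the training set, and max_features controls the random subset of features considered at each split; the values shown and X_train / y_train are assumptions.)

    from sklearn.ensemble import RandomForestRegressor

    rf = RandomForestRegressor(
        n_estimators=100,     # one regression tree per bootstrap sample
        max_features="sqrt",  # features chosen at random for consideration at each node
        bootstrap=True,       # resample the training set with replacement
    )
    # rf.fit(X_train, y_train)  # X_train, y_train assumed to exist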
The main idea behind ensemble prediction methods such as Random Forests is that aggregating predictions from predictors trained on multiple bootstrap samples of the training set:
reduces bias
reduces model variance
reduces both bias and model variance
none of these
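(A sketch of the idea in the question above, assuming scikit-learn and a synthetic regression dataset: averaging predictions from many trees, each trained on its own bootstrap sample, typically yields a lower cross-validated error than a single deep tree, because the averaging reduces model variance rather than bias.)

    from sklearn.datasets import make_regression
    from sklearn.ensemble import BaggingRegressor
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeRegressor

    # Synthetic data, purely illustrative.
    X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)

    single_tree = DecisionTreeRegressor(random_state=0)
    bagged_trees = BaggingRegressor(DecisionTreeRegressor(), n_estimators=100, random_state=0)

    # The bagged ensemble usually shows a smaller mean squared error,
    # reflecting reduced variance from aggregating bootstrap-trained trees.
    print(-cross_val_score(single_tree, X, y, scoring="neg_mean_squared_error").mean())
    print(-cross_val_score(bagged_trees, X, y, scoring="neg_mean_squared_error").mean())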