RNNs Basics
Respond to the following questions
Q1 The figure below shows an RNN with one input unit x, one logistic hidden unit h, and one linear output unit y. The network parameters are Wxh = -0.1, Whh = 0.5, Why = 0.25, hbias = 0.4, and ybias = 0.0. The input takes the values 18, 9, and -8 at time steps 0, 1, and 2.
1 point
The RNN is unrolled for time steps T = 0, 1, and 2
The figure only shows time propagation, not space propagation
The RNN is not unrolled
Q2 The logistic function corresponds to the following equation
1 point
f(x)=tanh(x)
f(x)=1/(1+e^(-x))
f(x)=1/(1+e^(x))
None of the above
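For reference, the logistic (sigmoid) function f(x) = 1/(1 + e^(-x)) used by the hidden unit can be sketched as follows (a minimal illustration, not part of the quiz):

```python
import math

def logistic(x):
    # Logistic (sigmoid) function: squashes any real number into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

print(logistic(0))  # 0.5 — the logistic function crosses 0.5 at the origin
```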
Q3 Compute the hidden unit value h0
1 point
0.1
0.3
0.2
None of the above
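Using the parameters from Q1, h0 can be computed as follows (a sketch assuming the initial hidden state before t = 0 is zero, which the question does not state explicitly):

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

# Parameters from Q1; h_prev = 0 is an assumption about the initial state
Wxh, Whh, hbias = -0.1, 0.5, 0.4
x0, h_prev = 18, 0.0

h0 = logistic(Wxh * x0 + Whh * h_prev + hbias)  # logistic(-1.4)
print(round(h0, 2))  # 0.2
```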
Q4 Mark all true statements about the output unit values
0 points
The output unit value y1 is 0.05
The output unit value y2 is 0.2
The output unit value y3 is 0.1
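The full forward pass over the three time steps can be sketched as follows (again assuming a zero initial hidden state; note the options above number the outputs from 1 while the time steps run from 0):

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

# Parameters and inputs from Q1
Wxh, Whh, Why = -0.1, 0.5, 0.25
hbias, ybias = 0.4, 0.0
xs = [18, 9, -8]  # inputs at t = 0, 1, 2

h = 0.0   # assumed initial hidden state
ys = []   # collected outputs
for t, x in enumerate(xs):
    h = logistic(Wxh * x + Whh * h + hbias)
    y = Why * h + ybias
    ys.append(y)
    print(f"t={t}: h={h:.4f}, y={y:.4f}")
```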
Q5 Consider the following RNN
1 point
The RNN has two input units, two hidden units and one output unit
The RNN has two input units, two time-steps units and one loss unit
The RNN has two input units, two time-steps units and one output unit
Q6 For the image from the previous question, every arrow denotes the effect of a variable at time t on a variable at time t+1. Which feed-forward neural network is equivalent to this network unrolled in time?
1 point
Option 1
Option 2
Option 3
None of the above
Q7 Mark all statements that are true
1 point
Vanishing gradients can only be solved by regularization
Exploding gradients and vanishing gradients can happen when training RNNs with backpropagation through time
The goal of BPTT is to calculate the gradients of the error with respect to the parameters of the RNN, which is only the matrix of weights through time
The matrix of weights W in an RNN is different at each time step