1 Recurrent Neural Networks - Introduction to RNNs
2 MCQ - An RNN cannot work on input sequences of arbitrary length
3 MCQ - An RNN can analyze time series data such as stock prices
4 MCQ - An RNN looks much like a feedforward neural network, except that it also has connections pointing backward
5 MCQ - A part of a neural network that preserves some state across time steps is called a memory cell
6 Recurrent Neural Networks - Forecasting a Time Series (see the first sketch after this list)
7 MCQ - In an encoder-decoder network, we have a sequence-to-vector network, called a decoder, followed by a vector-to-sequence network, called an encoder
8 MCQ - To train an RNN, the trick is to unroll it through time and then simply use regular backpropagation. This strategy is called backpropagation through time (BPTT).
9 MCQ - When the data is a sequence of one or more values per time step, it is called a time series
10 MCQ - A univariate time series involves two or more input variables
11 MCQ - The task of predicting missing values from the past is called imputation
12 Recurrent Neural Networks - Forecasting Several Time Steps Ahead
13 MCQ - What is/are the option(s) to predict 10 time steps ahead using an RNN? (see the second sketch after this list)
14 Recurrent Neural Networks - Handling Long Sequences
15 MCQ - The Long Short-Term Memory (LSTM) cell was proposed by Sepp Hochreiter and Jürgen Schmidhuber in the year
16 MCQ - The Gated Recurrent Unit (GRU) cell was proposed by Kyunghyun Cho et al. in the year
17 MCQ - GRU can be implemented in Keras using the keras.layers.GRU layer (see the third sketch after this list)
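
First sketch (items 6 and 8): a minimal illustration of forecasting the next value of a time series with a simple RNN in Keras. The synthetic sine-wave data, the window length of 50, and the layer sizes are assumptions made for this sketch, not part of the exercises. Note that calling fit on such a model is exactly the strategy item 8 describes: the network is unrolled through time and trained with regular backpropagation, i.e. backpropagation through time (BPTT).

```python
import numpy as np
from tensorflow import keras

# Toy dataset (an assumption for illustration): windows of 50 steps
# from a sine wave, where the target is the value at step 51.
t = np.linspace(0, 100, 10_000, dtype=np.float32)
series = np.sin(t)
X = np.stack([series[i:i + 50] for i in range(len(series) - 51)])[..., np.newaxis]
y = series[50:-1]  # one target per window

model = keras.models.Sequential([
    keras.Input(shape=(None, 1)),  # univariate series, any length
    keras.layers.SimpleRNN(20, return_sequences=True),
    keras.layers.SimpleRNN(20),
    keras.layers.Dense(1),  # predict a single next value
])
model.compile(loss="mse", optimizer="adam")
# Training unrolls the RNN through time and applies regular
# backpropagation -- backpropagation through time (BPTT).
model.fit(X, y, epochs=2, batch_size=32)
```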
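Second sketch (item 13): two commonly cited options for predicting 10 time steps ahead are (a) predicting one step at a time and feeding each prediction back into the input window, and (b) training the RNN to output all 10 future values at once. The sketch below continues the previous one (it reuses the illustrative `model` and `X`) and is an assumption-laden illustration, not the exercise's official answer key.

```python
# Option (a): iterative one-step forecasting, feeding each
# prediction back into the window (reuses `model` and `X` above).
X_new = X[:1]  # a single window of 50 steps, shape (1, 50, 1)
for _ in range(10):
    next_val = model.predict(X_new[:, -50:], verbose=0)  # shape (1, 1)
    X_new = np.concatenate([X_new, next_val[:, :, np.newaxis]], axis=1)
ten_ahead = X_new[0, -10:, 0]  # the 10 forecast values

# Option (b): a separate model whose Dense head emits all
# 10 future steps in one forward pass.
model_10 = keras.models.Sequential([
    keras.Input(shape=(None, 1)),
    keras.layers.SimpleRNN(20, return_sequences=True),
    keras.layers.SimpleRNN(20),
    keras.layers.Dense(10),  # 10 time steps ahead at once
])
```

Option (a) needs no retraining but compounds errors step by step; option (b) requires retraining with 10-value targets but forecasts in a single pass.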
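Third sketch (item 17): keras.layers.GRU does exist in Keras and is a drop-in replacement for SimpleRNN. A minimal, self-contained sketch; the layer sizes are arbitrary choices for illustration.

```python
from tensorflow import keras

# A GRU-based forecaster: same structure as the SimpleRNN model,
# with keras.layers.GRU swapped in for each recurrent layer.
gru_model = keras.models.Sequential([
    keras.Input(shape=(None, 1)),
    keras.layers.GRU(20, return_sequences=True),
    keras.layers.GRU(20),
    keras.layers.Dense(1),
])
gru_model.compile(loss="mse", optimizer="adam")
```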