
Consider a simplified Recurrent Neural Network (RNN) with a single input and a single output. The hidden state is updated using the recurrence:

$$ h_t = \text{ReLU}(W_{ih} \cdot x_t + W_{hh} \cdot h_{t-1}) $$

Assume the following:

  • \( x_t = 3 \) for every time step
  • \( h_0 = 0 \)
  • \( W_{ih} = 0.4 \)
  • \( W_{hh} = 0.6 \)
  • Activation function: ReLU

Compute the value of the hidden state \( h_4 \) at time \( t = 4 \).


1 Answer

Best answer

We compute each hidden state step-by-step using

$$ h_t = \text{ReLU}(W_{ih} \cdot x_t + W_{hh} \cdot h_{t-1}). $$

\( h_1 = \text{ReLU}(0.4 \cdot 3 + 0.6 \cdot 0) = 1.2 \)

\( h_2 = \text{ReLU}(0.4 \cdot 3 + 0.6 \cdot 1.2) = 1.92 \)

\( h_3 = \text{ReLU}(0.4 \cdot 3 + 0.6 \cdot 1.92) = 2.352 \)

\( h_4 = \text{ReLU}(0.4 \cdot 3 + 0.6 \cdot 2.352) = 2.6112 \)

Final Answer: \( h_4 = 2.6112 \)
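The step-by-step computation above can be checked with a short Python loop (a minimal sketch; the variable names are our own, not part of the question):

```python
# Iterate the recurrence h_t = ReLU(W_ih * x_t + W_hh * h_{t-1})
# with the scalar values given in the question.
W_ih, W_hh = 0.4, 0.6
x = 3.0   # constant input at every time step
h = 0.0   # h_0

for t in range(1, 5):
    h = max(0.0, W_ih * x + W_hh * h)  # scalar ReLU
    print(f"h_{t} = {round(h, 4)}")
```

Running this prints the same sequence 1.2, 1.92, 2.352, 2.6112 derived by hand (rounded to four decimal places to hide floating-point noise).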
