Recurrent Neural Network Cell (RNNCell)#
API#
- class warp_nn.modules.layers.RNNCell(input_size: int, hidden_size: int, *, bias: bool = True)[source]#
Bases: `Module`

Apply an Elman Recurrent Neural Network (RNN) cell.
\[\text{RNNCell}(x, h) = \text{tanh}(W_{ih} \, x + b_{ih} + W_{hh} \, h + b_{hh})\]
Learnable parameters:

| Name | Shape | Description |
|---|---|---|
| \(W_{ih}\) (`weight_ih`) | (hidden_size, input_size) | Input-to-hidden weights |
| \(W_{hh}\) (`weight_hh`) | (hidden_size, hidden_size) | Hidden-to-hidden weights |
| \(b_{ih}\) (`bias_ih`) | (hidden_size, 1) | Input-to-hidden bias; only if `bias` is true |
| \(b_{hh}\) (`bias_hh`) | (hidden_size, 1) | Hidden-to-hidden bias; only if `bias` is true |

The parameters are initialized from the uniform distribution \(u(-k, k)\) where \(k = \frac{1}{\sqrt{\text{hidden\_size}}}\).
- Parameters:
input_size – The number of input features.
hidden_size – The number of hidden features.
bias – Whether to include a bias term.
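To make the update rule and parameter shapes concrete, here is a minimal NumPy sketch of the same computation. It is an illustration of the math above, not the actual `warp_nn` implementation; the class name `RNNCellSketch` and its column-vector calling convention are assumptions for this example.

```python
import numpy as np


class RNNCellSketch:
    """Illustrative Elman RNN cell following the formula above.

    Not the warp_nn API; a standalone NumPy sketch for clarity.
    """

    def __init__(self, input_size: int, hidden_size: int, bias: bool = True):
        # Parameters are drawn from U(-k, k) with k = 1 / sqrt(hidden_size),
        # matching the initialization described in the docs.
        k = 1.0 / np.sqrt(hidden_size)
        self.weight_ih = np.random.uniform(-k, k, (hidden_size, input_size))
        self.weight_hh = np.random.uniform(-k, k, (hidden_size, hidden_size))
        self.bias_ih = np.random.uniform(-k, k, (hidden_size, 1)) if bias else None
        self.bias_hh = np.random.uniform(-k, k, (hidden_size, 1)) if bias else None

    def __call__(self, x: np.ndarray, h: np.ndarray) -> np.ndarray:
        # tanh(W_ih x + b_ih + W_hh h + b_hh)
        z = self.weight_ih @ x + self.weight_hh @ h
        if self.bias_ih is not None:
            z = z + self.bias_ih + self.bias_hh
        return np.tanh(z)


# Usage: one step of the recurrence on column vectors.
cell = RNNCellSketch(input_size=3, hidden_size=4)
x = np.ones((3, 1))       # input features
h = np.zeros((4, 1))      # initial hidden state
h_next = cell(x, h)       # new hidden state, shape (4, 1), values in (-1, 1)
```

Because the output of `tanh` is bounded, the new hidden state always lies strictly inside \((-1, 1)\), which is what keeps the recurrence from blowing up over a single step.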