# API Reference

This section contains the API reference for the Warp-NN library.

## Core modules

### Activations
- Exponential Linear Unit (ELU) activation function.
- Leaky Rectified Linear Unit (Leaky ReLU) activation function.
- Rectified Linear Unit (ReLU) activation function.
- Scaled Exponential Linear Unit (SELU) activation function.
- Sigmoid activation function.
- Softplus activation function.
- Softsign activation function.
- Hyperbolic Tangent (Tanh) activation function.
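The activations listed above all follow standard, well-known formulas. A minimal NumPy sketch of those formulas is shown below; the function names and signatures are illustrative only, not the actual Warp-NN API.

```python
import numpy as np

# Reference formulas for the listed activations (illustrative names,
# not the Warp-NN API).

def relu(x):
    # max(x, 0)
    return np.maximum(x, 0.0)

def leaky_relu(x, negative_slope=0.01):
    # x for x >= 0, negative_slope * x otherwise
    return np.where(x >= 0, x, negative_slope * x)

def elu(x, alpha=1.0):
    # x for x > 0, alpha * (exp(x) - 1) otherwise
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x):
    # ELU scaled with the fixed SELU constants
    alpha = 1.6732632423543772
    scale = 1.0507009873554805
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def sigmoid(x):
    # 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

def softplus(x):
    # log(1 + exp(x)), numerically stable via log1p
    return np.log1p(np.exp(x))

def softsign(x):
    # x / (1 + |x|)
    return x / (1.0 + np.abs(x))

def tanh(x):
    # hyperbolic tangent
    return np.tanh(x)
```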
### Initializers
- Initialize the array with a constant value.
- Initialize the array using the Kaiming (aka He) normal initialization method.
- Initialize the array using the Kaiming (aka He) uniform initialization method.
- Initialize the array with ones.
- Initialize the array with zeros.
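The initializers above have standard definitions; the sketch below shows them in NumPy under the common convention that `fan_in` is the input-feature dimension. Everything here is illustrative (including treating the last axis as `fan_in` and using gain sqrt(2), the ReLU gain), not the Warp-NN API.

```python
import numpy as np

# Illustrative initializer sketches (not the Warp-NN API).

def constant_init(shape, value):
    # Fill the array with a constant value
    return np.full(shape, value, dtype=np.float32)

def zeros_init(shape):
    return np.zeros(shape, dtype=np.float32)

def ones_init(shape):
    return np.ones(shape, dtype=np.float32)

def kaiming_normal(shape, gain=np.sqrt(2.0), rng=None):
    # He normal: samples N(0, std^2) with std = gain / sqrt(fan_in).
    # Assumption: fan_in is the last dimension of `shape`.
    rng = rng if rng is not None else np.random.default_rng()
    std = gain / np.sqrt(shape[-1])
    return rng.normal(0.0, std, size=shape).astype(np.float32)

def kaiming_uniform(shape, gain=np.sqrt(2.0), rng=None):
    # He uniform: samples U(-bound, bound) with
    # bound = gain * sqrt(3 / fan_in), matching the normal variant's variance.
    rng = rng if rng is not None else np.random.default_rng()
    bound = gain * np.sqrt(3.0 / shape[-1])
    return rng.uniform(-bound, bound, size=shape).astype(np.float32)
```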
### Layers
- Apply a 1D convolution.
- Apply a 2D convolution.
- Apply a Gated Recurrent Unit (GRU) cell.
- Apply a linear transformation over the final dimension of the input.
- Apply a Long Short-Term Memory (LSTM) cell.
- Apply an Elman Recurrent Neural Network (RNN) cell.
- Apply callable modules (e.g. layers, activation functions, etc.) connected in a cascading sequence.
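Two of the layers above are simple enough to sketch directly: the linear layer computes `y = x @ W.T + b` over the final dimension, and the cascading-sequence layer just threads an input through a list of callables. The names below are illustrative, not the Warp-NN API.

```python
import numpy as np

# Illustrative layer sketches (not the Warp-NN API).

def linear(x, weight, bias=None):
    # x: (..., in_features), weight: (out_features, in_features),
    # bias: (out_features,). Applied over the final dimension only.
    y = x @ weight.T
    if bias is not None:
        y = y + bias
    return y

def sequential(x, modules):
    # Apply callables in order, feeding each output to the next.
    for module in modules:
        x = module(x)
    return x
```

A batch of shape `(2, 3, 4)` passed through a `(5, 4)` weight yields `(2, 3, 5)`; only the last axis changes.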