API Reference

This section contains the API reference for the Warp-NN library.

Core modules

Module: Abstract base class for all modules.

Parameter: Class representing a learnable parameter.
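
Warp-NN's exact class signatures are not shown on this index page; as an illustration only, the Module/Parameter pattern common to neural-network libraries can be sketched as follows (all names and methods here are assumptions, not Warp-NN's actual API):

```python
# Hypothetical sketch of the Module/Parameter pattern.
# Names and signatures are illustrative, not Warp-NN's real API.

class Parameter:
    """A learnable value tracked by the enclosing Module."""
    def __init__(self, data):
        self.data = data   # current value
        self.grad = None   # gradient slot, filled in by backprop

class Module:
    """Abstract base: subclasses define __call__ and hold Parameters."""
    def parameters(self):
        # Recursively collect Parameters from attributes and sub-modules.
        params = []
        for value in vars(self).values():
            if isinstance(value, Parameter):
                params.append(value)
            elif isinstance(value, Module):
                params.extend(value.parameters())
        return params

class Scale(Module):
    """Toy module: multiplies its input by a learnable scalar."""
    def __init__(self, factor):
        self.factor = Parameter(factor)

    def __call__(self, x):
        return self.factor.data * x

m = Scale(3.0)
print(m(2.0))               # 6.0
print(len(m.parameters()))  # 1
```

The key design point this pattern captures: an optimizer only needs `parameters()` to find every learnable value, however deeply modules are nested.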

Activations

ELU: Exponential Linear Unit (ELU) activation function.

LeakyReLU: Leaky Rectified Linear Unit (Leaky ReLU) activation function.

ReLU: Rectified Linear Unit (ReLU) activation function.

SELU: Scaled Exponential Linear Unit (SELU) activation function.

Sigmoid: Sigmoid activation function.

SoftPlus: Soft-plus activation function.

SoftSign: Soft-sign activation function.

Tanh: Hyperbolic Tangent (Tanh) activation function.
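
For reference, the standard mathematical definitions behind several of the activations listed above can be written out in plain Python (this shows the math only, not Warp-NN's API; the ELU `alpha` and Leaky ReLU `negative_slope` defaults are assumptions):

```python
import math

# Reference definitions of common activations, applied elementwise.

def relu(x):
    return max(0.0, x)

def leaky_relu(x, negative_slope=0.01):  # slope default is an assumption
    return x if x >= 0 else negative_slope * x

def elu(x, alpha=1.0):                   # alpha default is an assumption
    return x if x >= 0 else alpha * (math.exp(x) - 1.0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def softplus(x):
    return math.log1p(math.exp(x))       # smooth approximation of ReLU

def softsign(x):
    return x / (1.0 + abs(x))            # like tanh, but polynomial tails

print(relu(-2.0))      # 0.0
print(sigmoid(0.0))    # 0.5
print(softsign(-1.0))  # -0.5
```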

Initializers

constant: Initialize the array with a constant value.

kaiming_normal: Initialize the array using the Kaiming (aka He) normal initialization method.

kaiming_uniform: Initialize the array using the Kaiming (aka He) uniform initialization method.

ones: Initialize the array with ones.

zeros: Initialize the array with zeros.
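
The Kaiming (He) schemes scale the initial weights by the fan-in so that activation variance is preserved through ReLU layers. A minimal sketch of the underlying formulas, assuming the common ReLU gain of sqrt(2) (Warp-NN's exact gain handling and signatures may differ):

```python
import math
import random

# Sketch of Kaiming (He) initialization formulas; fan_in is the
# number of input units feeding the layer.

def kaiming_normal(fan_in, n, gain=math.sqrt(2.0)):
    # Samples from N(0, std^2) with std = gain / sqrt(fan_in).
    std = gain / math.sqrt(fan_in)
    return [random.gauss(0.0, std) for _ in range(n)]

def kaiming_uniform(fan_in, n, gain=math.sqrt(2.0)):
    # Samples from U(-bound, bound) with bound = gain * sqrt(3 / fan_in),
    # which gives the same variance as the normal variant.
    bound = gain * math.sqrt(3.0 / fan_in)
    return [random.uniform(-bound, bound) for _ in range(n)]

w = kaiming_uniform(fan_in=256, n=1000)
bound = math.sqrt(2.0) * math.sqrt(3.0 / 256)
print(all(-bound <= v <= bound for v in w))  # True
```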

Layers

Conv1D: Apply a 1D convolution.

Conv2D: Apply a 2D convolution.

GRUCell: Apply a Gated Recurrent Unit (GRU) cell.

Linear: Apply a linear transformation over the final dimension of the input.

LSTMCell: Apply a Long Short-Term Memory (LSTM) cell.

RNNCell: Apply an Elman Recurrent Neural Network (RNN) cell.

Sequential: Apply callable modules (e.g. layers and activation functions) connected in a cascading sequence.
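
To make the Linear and Sequential descriptions concrete, here is a minimal sketch of a linear transformation over the last dimension followed by a cascading sequence of callables (illustrative only; these are plain functions, not Warp-NN's classes):

```python
# Illustrative sketch, not Warp-NN's API: a linear map over the final
# dimension, and a "sequential" cascade of callables.

def linear(x, weight, bias):
    # x: list of in_features values.
    # weight: out_features rows, each of length in_features.
    # Computes y = W @ x + b.
    return [sum(w_i * x_i for w_i, x_i in zip(row, x)) + b
            for row, b in zip(weight, bias)]

def sequential(*layers):
    # Each layer's output becomes the next layer's input.
    def forward(x):
        for layer in layers:
            x = layer(x)
        return x
    return forward

model = sequential(
    lambda x: linear(x, weight=[[1.0, 2.0], [0.0, 1.0]], bias=[0.5, 0.0]),
    lambda x: [max(0.0, v) for v in x],  # ReLU
)
print(model([1.0, -1.0]))  # [0.0, 0.0]
```

The first stage maps [1, -1] to [-0.5, -1.0]; the ReLU stage then clamps both negatives to zero.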

Optimizers

Adam: Adam (Adaptive Moment Estimation) optimizer.

SGD: Stochastic Gradient Descent (SGD) optimizer.
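
The update rules behind these two optimizers, written out for a single scalar parameter (hyperparameter names and defaults follow common convention and are not necessarily Warp-NN's):

```python
import math

# Scalar-parameter update rules for SGD and Adam (sketch).

def sgd_step(param, grad, lr=0.01):
    # Plain gradient descent step.
    return param - lr * grad

def adam_step(param, grad, m, v, t, lr=0.001,
              beta1=0.9, beta2=0.999, eps=1e-8):
    # m, v: running first/second moment estimates; t: step count (>= 1).
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    m_hat = m / (1 - beta1 ** t)   # bias correction for the warm-up phase
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v

print(sgd_step(1.0, 0.5, lr=0.1))          # approx. 0.95
p, m, v = adam_step(1.0, 0.5, m=0.0, v=0.0, t=1)
print(p)                                   # approx. 0.999
```

Note how Adam's bias correction makes the very first step behave like a full-sized step of roughly `lr`, despite the moment estimates starting at zero.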