Activations

Activations are non-linear functions, applied element-wise, that transform their input into a new representation.

The following table lists the available activations:

| Activation | Description |
| --- | --- |
| ELU | Exponential Linear Unit (ELU) activation function. |
| LeakyReLU | Leaky Rectified Linear Unit (Leaky ReLU) activation function. |
| ReLU | Rectified Linear Unit (ReLU) activation function. |
| SELU | Scaled Exponential Linear Unit (SELU) activation function. |
| Sigmoid | Sigmoid activation function. |
| SoftPlus | Soft-plus activation function. |
| SoftSign | Soft-sign activation function. |
| Tanh | Hyperbolic Tangent (Tanh) activation function. |
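As a rough illustration only (not the library's implementation), the element-wise definitions behind these activations can be sketched with NumPy. The function names, the `negative_slope` and `alpha` parameters, and the SELU constants are assumptions chosen to match the common textbook formulations:

```python
import numpy as np

def relu(x):
    # max(0, x)
    return np.maximum(0.0, x)

def leaky_relu(x, negative_slope=0.01):
    # x if x > 0, else negative_slope * x
    return np.where(x > 0, x, negative_slope * x)

def elu(x, alpha=1.0):
    # x if x > 0, else alpha * (exp(x) - 1)
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x):
    # Scaled ELU; constants from Klambauer et al. (2017)
    alpha, scale = 1.6732632423543772, 1.0507009873554805
    return scale * elu(x, alpha)

def sigmoid(x):
    # 1 / (1 + exp(-x)), squashes input to (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def softplus(x):
    # log(1 + exp(x)), a smooth approximation of ReLU
    return np.log1p(np.exp(x))

def softsign(x):
    # x / (1 + |x|), squashes input to (-1, 1)
    return x / (1.0 + np.abs(x))

def tanh(x):
    # Hyperbolic tangent, squashes input to (-1, 1)
    return np.tanh(x)
```

All of these operate element-wise, so they accept arrays of any shape and return an array of the same shape.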