Leaky Rectified Linear Unit (Leaky ReLU)#

API#

class warp_nn.modules.activations.LeakyReLU(*, negative_slope: float = 0.01)[source]#

Bases: Module

Leaky Rectified Linear Unit (Leaky ReLU) activation function.

This class computes the element-wise Leaky ReLU activation function:

\[\begin{split}\text{LeakyReLU}(x) = \begin{cases} x, & \text{ if } x \geq 0\\ \alpha \, x, & \text{ if } x < 0 \end{cases}\end{split}\]

where

\[\alpha = \text{negative\_slope}\]
Parameters:

negative_slope – The negative slope value for the Leaky ReLU function.
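
The piecewise definition above can be reproduced in plain NumPy as a reference sketch. This is an illustration of the math only, not warp_nn's actual implementation; the function name `leaky_relu` here is a stand-in:

```python
import numpy as np

def leaky_relu(x, negative_slope=0.01):
    """Element-wise Leaky ReLU: x where x >= 0, negative_slope * x otherwise."""
    return np.where(x >= 0, x, negative_slope * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
# Negative entries are scaled by negative_slope; non-negative entries pass through.
print(leaky_relu(x))
```

With the default `negative_slope=0.01`, `-2.0` maps to `-0.02` and `1.5` is returned unchanged.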

__call__(input: array) → array[source]#

Forward pass of the activation function.

Parameters:

input – The input array, with up to 3 dimensions.

Returns:

The output array, with the same shape as the input array.
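
Because the activation is element-wise, the output shape always matches the input shape, for 1-D through 3-D inputs alike. A quick NumPy sketch of this shape contract (again independent of warp_nn's internals):

```python
import numpy as np

def leaky_relu(x, negative_slope=0.01):
    # Element-wise, so the result has the same shape as x.
    return np.where(x >= 0, x, negative_slope * x)

# A 3-D input, e.g. (channels, height, width).
x = np.linspace(-1.0, 1.0, 2 * 3 * 4).reshape(2, 3, 4)
y = leaky_relu(x)
assert y.shape == x.shape  # shape is preserved
```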

property negative_slope[source]#

The negative slope value for the Leaky ReLU function.