Leaky Rectified Linear Unit (Leaky ReLU)#
API#
- class warp_nn.modules.activations.LeakyReLU(*, negative_slope: float = 0.01)[source]#
Bases: Module

Leaky Rectified Linear Unit (Leaky ReLU) activation function.
This class computes the element-wise Leaky ReLU activation function:
\[\begin{split}\text{LeakyReLU}(x) = \begin{cases} x, & \text{ if } x \geq 0\\ \alpha \, x, & \text{ if } x < 0 \end{cases}\end{split}\]

where \(\alpha = \text{negative\_slope}\).

- Parameters:
negative_slope – The slope \(\alpha\) applied to negative inputs (default: 0.01).
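The piecewise definition above can be sketched with a minimal NumPy reference implementation. This is an illustration of the math only, not the `warp_nn` module's actual code; the helper name `leaky_relu` is hypothetical.

```python
import numpy as np

def leaky_relu(x, negative_slope=0.01):
    # Element-wise: keep x where x >= 0, scale by negative_slope otherwise.
    return np.where(x >= 0, x, negative_slope * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(leaky_relu(x))  # [-0.02  -0.005  0.     1.5  ]
```

Note that positive inputs pass through unchanged, while negative inputs are scaled by \(\alpha\) rather than clamped to zero, which keeps a small gradient alive for negative activations.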