Bases: Module
Rectified Linear Unit (ReLU) activation function.
This class computes the element-wise ReLU activation function:
\[\begin{split}\text{ReLU}(x) = \begin{cases}
x, & \text{ if } x \geq 0\\
0, & \text{ if } x < 0
\end{cases}\end{split}\]
__call__(input: array) → array
Forward pass of the activation function.
- Parameters:
input – The input array, with up to 3 dimensions.
- Returns:
The output array, with the same shape as the input array.
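The element-wise computation above can be sketched with NumPy as a stand-in for this library's `array` type (a minimal illustration, not this module's implementation):

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """Element-wise ReLU: returns x where x >= 0, else 0."""
    return np.maximum(x, 0)

x = np.array([-2.0, -0.5, 0.0, 1.5])
out = relu(x)
assert out.shape == x.shape          # output keeps the input's shape
assert (out == np.array([0.0, 0.0, 0.0, 1.5])).all()
```

Negative entries are clamped to zero while non-negative entries pass through unchanged, matching the piecewise definition.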