relu

nvtripy.relu(input: Tensor) → Tensor

Applies the Rectified Linear Unit (ReLU) function to each element of the input tensor:

\(\text{relu}(x) = \max(0,x)\)

Parameters:

input (Tensor) – [dtype=T1] The input tensor.

Returns:

[dtype=T1] A tensor of the same shape as the input.

Return type:

Tensor

Data Type Constraints:
Example
import nvtripy as tp

input = tp.Tensor([1.0, 2.0, 3.0, 4.0], dtype=tp.float32)
output = tp.relu(input)
Local Variables
>>> input
tensor([1, 2, 3, 4], dtype=float32, loc=cpu:0, shape=(4,))

>>> output
tensor([1, 2, 3, 4], dtype=float32, loc=gpu:0, shape=(4,))
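Because every value in the example above is positive, the output equals the input; ReLU only has a visible effect on negative elements. The formula \(\max(0, x)\) can be sketched in plain Python (this mirrors the math, not the nvtripy implementation, and the `relu` helper here is purely illustrative):

```python
# Elementwise max(0, x), as in the ReLU definition above.
# Illustrative sketch only -- not the nvtripy implementation.
def relu(values):
    return [max(0.0, x) for x in values]

print(relu([-2.0, -0.5, 0.0, 1.5, 3.0]))  # negatives clamp to 0.0
```

With a mixed-sign input such as `[-2.0, -0.5, 0.0, 1.5, 3.0]`, the negative entries are clamped to `0.0` while the non-negative entries pass through unchanged.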