relu¶
- nvtripy.relu(input: Tensor) → Tensor¶
Applies the Rectified Linear Unit (ReLU) function to each element of the input tensor:
\(\text{relu}(x) = \max(0,x)\)
- Parameters:
input (Tensor) – The input tensor.
- Returns:
A tensor of the same shape as the input.
- Return type:
Tensor
- INPUT REQUIREMENTS:
input.dtype is one of [float32, float16, bfloat16, int4, int32, int64, int8]
- OUTPUT GUARANTEES:
return[0].dtype == input.dtype
Example
input = tp.Tensor([1.0, 2.0, 3.0, 4.0])
output = tp.relu(input)
Local Variables¶
>>> input
tensor([1, 2, 3, 4], dtype=float32, loc=cpu:0, shape=(4,))
>>> output
tensor([1, 2, 3, 4], dtype=float32, loc=gpu:0, shape=(4,))
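Because every value in the example above is positive, the output equals the input and the clipping behavior is not visible. The formula \(\text{relu}(x) = \max(0, x)\) can be sketched with plain NumPy (a stand-in for illustration, not Tripy's actual implementation):

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    # Elementwise max(0, x); the cast preserves the input dtype,
    # mirroring the guarantee return[0].dtype == input.dtype.
    return np.maximum(0, x).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 3.0], dtype=np.float32)
y = relu(x)  # negative entries are clipped to 0, positives pass through
```

With a mixed-sign input, the negative entries become 0.0 while 3.0 is unchanged, and the result dtype stays float32.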