silu

nvtripy.silu(input: Tensor) → Tensor

Applies the Sigmoid Linear Unit (SiLU), also known as the swish function, to each element of the input tensor.

\(\text{silu}(x) = x \cdot \text{sigmoid} (x)\)

where:

\(\text{sigmoid}(x) = \frac{1}{1 + e^{-x}}\)
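
For reference, the elementwise computation can be sketched outside of nvtripy. The snippet below is a minimal NumPy illustration of the formula above (NumPy and the silu_reference name are assumptions for this sketch, not part of the nvtripy API):

import numpy as np

def silu_reference(x: np.ndarray) -> np.ndarray:
    # silu(x) = x * sigmoid(x) = x / (1 + exp(-x)), applied elementwise
    return x / (1.0 + np.exp(-x))

print(silu_reference(np.array([1.0, 2.0, 3.0, 4.0], dtype=np.float32)))
# Prints approximately [0.731059 1.761594 2.857722 3.928055],
# matching the example output below.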

Parameters:

input (Tensor) – The input tensor.

Returns:

A tensor of the same shape as the input.

Return type:

Tensor

INPUT REQUIREMENTS:

input.dtype is one of [float32, float16, bfloat16, int8]

OUTPUT GUARANTEES:

return[0].dtype == input.dtype
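
The dtype guarantee can be checked directly. A minimal sketch, assuming the usual import nvtripy as tp alias (as in the example below):

import nvtripy as tp

x = tp.Tensor([1.0, 2.0, 3.0, 4.0])  # float32, per the example below
y = tp.silu(x)
assert y.dtype == x.dtype  # output dtype matches input dtype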

Example
import nvtripy as tp

input = tp.Tensor([1.0, 2.0, 3.0, 4.0])
output = tp.silu(input)
Local Variables
>>> input
tensor([1, 2, 3, 4], dtype=float32, loc=cpu:0, shape=(4,))

>>> output
tensor([0.731059, 1.76159, 2.85772, 3.92806], dtype=float32, loc=gpu:0, shape=(4,))