silu
- tripy.silu(input: Tensor) → Tensor
Applies the Sigmoid Linear Unit (SiLU) function to each element of the input tensor. This function is also known as the swish function.
\(\text{silu}(x) = x \cdot \sigma(x)\), where \(\sigma(x)_i = \frac{1}{1 + \exp(-x_i)}\)
- Parameters:
input (Tensor) – [dtype=T1] The input tensor.
- Returns:
[dtype=T1] A tensor of the same shape as the input.
- Return type:
Tensor
Example
import tripy as tp

input = tp.Tensor([1., 2., 3., 4.], dtype=tp.float32)
output = tp.silu(input)
>>> input
tensor([1.0000, 2.0000, 3.0000, 4.0000], dtype=float32, loc=gpu:0, shape=(4,))

>>> output
tensor([0.7311, 1.7616, 2.8577, 3.9281], dtype=float32, loc=gpu:0, shape=(4,))
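To connect the formula above to the API, here is a minimal sketch that recomputes SiLU elementwise as \(x \cdot \sigma(x)\) and compares it against tp.silu. It assumes tp.sigmoid and the elementwise * operator on tensors are available, as elsewhere in tripy:

import tripy as tp

x = tp.Tensor([1., 2., 3., 4.], dtype=tp.float32)

# Manual SiLU: x · σ(x), following the formula in the description.
# Assumes tp.sigmoid and elementwise * behave as in the rest of tripy.
manual = x * tp.sigmoid(x)
fused = tp.silu(x)

print(manual)
print(fused)  # Both should print tensor([0.7311, 1.7616, 2.8577, 3.9281], ...)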