softmax
- tripy.softmax(input: Tensor, dim: int = None) → Tensor [source]
Applies the softmax function to the input tensor:

\(\text{softmax}(x_{i}) = \frac{e^{x_{i}}}{\sum_{j=1}^{N} e^{x_{j}}} \quad \text{for } i = 1, 2, \dots, N\)

where \(x_{i}\) is the \(i^{th}\) element along dimension dim and \(N\) is the size of that dimension.

Effectively, for each slice along dim, elements are scaled such that they lie in the range \([0, 1]\) and sum to 1.

- Parameters:
input (Tensor) – [dtype=T1] The input tensor.
dim (int) – The dimension along which softmax will be computed. If this is None, softmax is applied over the whole input array.
- Returns:
[dtype=T1] A tensor of the same shape as the input.
- Return type:
Tensor
Example
input = tp.iota([2, 2], dtype=tp.float32)
output = tp.softmax(input, dim=0)
>>> input
tensor(
    [[0.0000, 0.0000],
     [1.0000, 1.0000]],
    dtype=float32, loc=gpu:0, shape=(2, 2))

>>> output
tensor(
    [[0.2689, 0.2689],
     [0.7311, 0.7311]],
    dtype=float32, loc=gpu:0, shape=(2, 2))
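As a sanity check on the values printed above, the sketch below recomputes the same result by hand with NumPy (used here purely for illustration; it is not part of Tripy). With dim=0, each column of the iota input is [0, 1], so the expected entries are \(e^{0}/(e^{0}+e^{1}) \approx 0.2689\) and \(e^{1}/(e^{0}+e^{1}) \approx 0.7311\), matching the output tensor shown in the example.

import numpy as np

def reference_softmax(x: np.ndarray, dim: int = None) -> np.ndarray:
    # When dim is None, normalize over the flattened array, mirroring the
    # documented behavior of tripy.softmax with dim=None.
    if dim is None:
        x = x.reshape(-1)
        dim = 0
    # Subtract the per-slice maximum for numerical stability; softmax is
    # shift-invariant, so this does not change the result.
    shifted = x - x.max(axis=dim, keepdims=True)
    e = np.exp(shifted)
    return e / e.sum(axis=dim, keepdims=True)

# iota([2, 2]) counts along the first axis: [[0, 0], [1, 1]] (see the example).
inp = np.array([[0.0, 0.0], [1.0, 1.0]], dtype=np.float32)
print(reference_softmax(inp, dim=0))
# [[0.2689 0.2689]
#  [0.7311 0.7311]]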