softmax#

Compute the softmax of the values in the input. Softmax is defined as:

\[\sigma(z_i) = \frac{e^{z_i}}{\sum_{j=1}^{K} e^{z_j}} \quad \text{for } i = 1, 2, \dots, K\]
template<typename InType, int D>
__MATX_INLINE__ auto matx::softmax(const InType &in, const int (&dims)[D])#

Calculate the softmax of values in a tensor over the specified dimensions

softmax computes the exponential of each value divided by the sum of the exponentials of the values in the reduced set. The axes over which the softmax is performed determine which axes the sum is computed over; the rank and sizes of the output match those of the input. Note that the traditional definition of softmax is simply exp(x)/sum(exp(x)), but this is not how most libraries implement it. Instead, x is first shifted by a correction factor of max(x) for numerical stability.
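
The max-shifted form is mathematically equivalent to the definition above, since the common factor \(e^{-\max_j z_j}\) cancels between numerator and denominator, but it avoids overflow when the inputs are large:

\[\sigma(z_i) = \frac{e^{z_i - \max_j z_j}}{\sum_{j=1}^{K} e^{z_j - \max_j z_j}}\]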

Template Parameters:
  • InType – Input data type

  • D – Rank of dimension array

Parameters:
  • in – Input data to compute the softmax

  • dims – C-style array containing the dimensions to sum over

template<typename InType>
__MATX_INLINE__ auto matx::softmax(const InType &in)#

Calculate the softmax of values in a tensor treated as a flat vector

softmax computes the exponential of each value divided by the sum of the exponentials of the values in the reduced set. In this overload the input is treated as a flat vector, so the sum is computed over all elements; the rank and sizes of the output match those of the input. Note that the traditional definition of softmax is simply exp(x)/sum(exp(x)), but this is not how most libraries implement it. Instead, x is first shifted by a correction factor of max(x) for numerical stability.

Template Parameters:
  • InType – Input data type

Parameters:
  • in – Input data to compute the softmax

Examples#

(t1_out = softmax(t1)).run(exec);
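
A minimal sketch of how this call might be set up; the tensor shape, variable names, and use of a CUDA stream executor are assumptions for illustration only:

// Sketch only: shapes and executor choice are assumed
auto t1 = matx::make_tensor<float>({10});      // rank-1 input, assumed length 10
auto t1_out = matx::make_tensor<float>({10});  // output has the same rank and sizes
matx::cudaExecutor exec{};                     // default CUDA stream executor

(t1_out = matx::softmax(t1)).run(exec);        // softmax over the input treated as a flat vector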

Examples#

(t3_out = softmax(t3, {2})).run(exec);
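
A sketch of how the rank-3 example might be set up, computing the softmax along the innermost dimension; the shape and variable names are assumptions for illustration only:

// Sketch only: shape is assumed; dims = {2} selects the innermost axis
auto t3 = matx::make_tensor<float>({4, 8, 16});
auto t3_out = matx::make_tensor<float>({4, 8, 16});  // rank and sizes match the input
matx::cudaExecutor exec{};

(t3_out = matx::softmax(t3, {2})).run(exec);   // sum of exponentials taken along dimension 2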