megengine.functional.nn.softmax#

softmax(inp, axis=None)[source]#

Applies the \(\text{softmax}(x)\) function, which is defined as:

\[\text{softmax}(x_{i}) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}\]

It is applied to all elements along the given axis, and rescales them so that each element lies in the range [0, 1] and the elements along that axis sum to 1.

See Softmax for more details.
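The formula above can be sketched in plain NumPy; the helper name `softmax_ref` is illustrative, not part of MegEngine's API, and subtracting the per-axis maximum is a common numerical-stability trick that leaves the result unchanged because softmax is shift-invariant along the reduced axis:

```python
import numpy as np

def softmax_ref(x, axis=-1):
    # Shift by the max along the axis for numerical stability;
    # exp(x - max) avoids overflow and does not change the ratio.
    shifted = x - x.max(axis=axis, keepdims=True)
    e = np.exp(shifted)
    # Normalize so the elements along `axis` sum to 1.
    return e / e.sum(axis=axis, keepdims=True)

x = np.arange(-5, 5, dtype=np.float32).reshape(2, 5)
out = softmax_ref(x, axis=1)
print(out.round(4))
print(out.sum(axis=1))  # each row sums to 1
```

Because softmax is shift-invariant, both rows of this input (which differ only by a constant offset of 5) produce identical outputs.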

Examples

>>> import numpy as np
>>> import megengine.functional as F
>>> from megengine import Tensor
>>> x = Tensor(np.arange(-5, 5, dtype=np.float32)).reshape(2,5)
>>> out = F.softmax(x)
>>> out.numpy().round(decimals=4)
array([[0.0117, 0.0317, 0.0861, 0.2341, 0.6364],
       [0.0117, 0.0317, 0.0861, 0.2341, 0.6364]], dtype=float32)
Return type:

Tensor