megengine.functional.nn.softplus

softplus(inp)

Applies the element-wise function:

\[\text{softplus}(x) = \log(1 + \exp(x))\]

softplus is a smooth approximation of the ReLU function and can be used to constrain the output to always be positive. For numerical stability, the implementation follows this transformation:

\[\begin{aligned}
\text{softplus}(x) &= \log(1 + \exp(x)) \\
                   &= \log(1 + \exp(-\text{abs}(x))) + \max(x, 0) \\
                   &= \operatorname{log1p}(\exp(-\text{abs}(x))) + \text{relu}(x)
\end{aligned}\]
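For instance, a minimal NumPy-only sketch (illustrative, not MegEngine's actual implementation) shows why the rewrite matters: the naive form overflows once exp(x) exceeds the floating-point range, while the transformed form stays finite:

>>> import numpy as np
>>> x = np.float32(1000.0)
>>> print(np.log(1.0 + np.exp(x)))   # naive form: exp(1000) overflows to inf
inf
>>> print(np.log1p(np.exp(-np.abs(x))) + np.maximum(x, 0.0))   # stable form
1000.0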

Examples

>>> import numpy as np
>>> from megengine import Tensor
>>> import megengine.functional as F
>>> x = Tensor(np.arange(-3, 3, dtype=np.float32))
>>> y = F.softplus(x)
>>> y.numpy().round(decimals=4)
array([0.0486, 0.1269, 0.3133, 0.6931, 1.3133, 2.1269], dtype=float32)
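
Because softplus smoothly approximates ReLU, the two agree closely for inputs of large magnitude. A quick check (reusing the imports above; this snippet is an illustration and assumes F.relu from megengine.functional):

>>> big = Tensor(np.array([-20.0, 20.0], dtype=np.float32))
>>> np.allclose(F.softplus(big).numpy(), F.relu(big).numpy(), atol=1e-6)
True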
Return type: Tensor