SiLU

class SiLU(name=None)[source]

Applies the element-wise function:

\[\text{SiLU}(x) = \frac{x}{1 + \exp(-x)}\]

Examples

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> data = mge.tensor(np.array([-2, -1, 0, 1, 2]).astype(np.float32))
>>> silu = M.SiLU()
>>> output = silu(data)
>>> with np.printoptions(precision=6):
...     print(output.numpy())
[-0.238406 -0.268941  0.        0.731059  1.761594]
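For reference, the same formula can be evaluated directly with NumPy (a minimal sketch that does not depend on MegEngine; the helper name `silu` here is illustrative, not part of the API):

```python
import numpy as np

def silu(x):
    # SiLU(x) = x / (1 + exp(-x)), i.e. x * sigmoid(x)
    return x / (1.0 + np.exp(-x))

data = np.array([-2, -1, 0, 1, 2], dtype=np.float32)
out = silu(data)
```

The resulting values agree with the module output above to printed precision.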