LeakyReLU

class LeakyReLU(negative_slope=0.01, **kwargs)[source]

Applies the element-wise function:

\[\text{LeakyReLU}(x) = \max(0, x) + \text{negative\_slope} \times \min(0, x)\]

or

\[\begin{split}\text{LeakyReLU}(x) = \begin{cases} x, & \text{ if } x \geq 0 \\ \text{negative\_slope} \times x, & \text{ otherwise } \end{cases}\end{split}\]
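The two definitions above are equivalent: for x ≥ 0 only the max term is non-zero, and for x < 0 only the scaled min term is. A minimal NumPy sketch of the first form (a plain illustration, not MegEngine's actual implementation):

```python
import numpy as np

def leaky_relu(x, negative_slope=0.01):
    # max(0, x) + negative_slope * min(0, x), applied element-wise
    return np.maximum(0, x) + negative_slope * np.minimum(0, x)

x = np.array([-8.0, -12.0, 6.0, 10.0], dtype=np.float32)
print(leaky_relu(x))  # [-0.08 -0.12  6.   10.  ]
```

Positive entries pass through unchanged, while negative entries are scaled by `negative_slope`, which keeps a small non-zero gradient for x < 0.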

Examples

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> data = mge.tensor(np.array([-8, -12, 6, 10]).astype(np.float32))
>>> leakyrelu = M.LeakyReLU(0.01)
>>> output = leakyrelu(data)
>>> output.numpy()
array([-0.08, -0.12,  6.  , 10.  ], dtype=float32)