ReLU

class ReLU(name=None)[source]

Applies the rectified linear unit function element-wise:

\[\text{ReLU}(x) = (x)^+ = \max(x, 0)\]
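In other words, negative inputs are clamped to zero while non-negative inputs pass through unchanged. As a quick plain-NumPy sketch of the same element-wise rule (illustrative only, independent of this module):

>>> import numpy as np
>>> x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0], dtype=np.float32)
>>> np.maximum(x, 0)  # element-wise max(x, 0)
array([0., 0., 0., 1., 2.], dtype=float32)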

Examples

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> data = mge.tensor(np.array([-2, -1, 0, 1, 2]).astype(np.float32))
>>> relu = M.ReLU()
>>> output = relu(data)
>>> with np.printoptions(precision=6):
...     print(output.numpy())
[0. 0. 0. 1. 2.]
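For one-off calls that do not need a module instance, the same computation can be expressed through the functional API (a minimal sketch, assuming `megengine.functional.relu` is available in the installed version; `data` is the tensor from the example above):

>>> import megengine.functional as F
>>> print(F.relu(data).numpy())
[0. 0. 0. 1. 2.]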