LayerNorm

class LayerNorm(normalized_shape, eps=1e-05, affine=True, **kwargs)[source]

Applies Layer Normalization over a mini-batch of inputs. Refer to Layer Normalization.

\[y = \frac{x - \mathrm{E}[x]}{ \sqrt{\mathrm{Var}[x] + \epsilon}} * \gamma + \beta\]

The mean and standard deviation are calculated separately over the last certain number of dimensions, which have to be of the shape specified by normalized_shape. \(\gamma\) and \(\beta\) are learnable affine transform parameters of normalized_shape if affine is True. The standard deviation is calculated via the biased estimator.
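As a concrete illustration, here is a minimal NumPy sketch of the normalization itself, with the affine step omitted (i.e. \(\gamma = 1\), \(\beta = 0\)); the choice of normalized_shape=(4, 4), and hence the last two axes, is an assumption for illustration:

>>> import numpy as np
>>> x = np.random.randn(2, 3, 4, 4).astype(np.float32)
>>> axes = (-2, -1)  # the last len(normalized_shape) dims for normalized_shape=(4, 4)
>>> mean = x.mean(axis=axes, keepdims=True)
>>> var = x.var(axis=axes, keepdims=True)  # biased estimator (ddof=0)
>>> y = (x - mean) / np.sqrt(var + 1e-5)
>>> bool(np.allclose(y.mean(axis=axes), 0, atol=1e-5))  # each (4, 4) slice has ~zero mean
True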

Note

Unlike Batch Normalization and Instance Normalization, which apply a scalar scale and bias for each entire channel/plane, Layer Normalization applies per-element scale and bias.
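A quick way to see this, assuming the affine parameters are exposed as the weight and bias attributes named in the affine parameter description below:

>>> import megengine.module as M
>>> m = M.LayerNorm((4, 4))
>>> m.weight.numpy().shape  # one scale per element of normalized_shape, not per channel
(4, 4)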

Parameters
  • normalized_shape (int or tuple) – input shape from an expected input of size \([*, normalized\_shape[0], normalized\_shape[1], ..., normalized\_shape[-1]]\). If it is a single integer, this module will normalize over the last dimension, which is expected to be of that specific size (see the sketch after this parameter list).

  • eps – a single value added to the denominator for numerical stability. Default: 1e-5

  • affine – this module has learnable affine parameters (weight, bias) when affine is set to True.
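For instance, a minimal sketch of the single-integer form, which normalizes only over the last dimension (the 3-D input shape here is an arbitrary choice for illustration):

>>> import numpy as np
>>> from megengine import Tensor
>>> import megengine.module as M
>>> m = M.LayerNorm(4)  # int form: normalize over the last dimension only
>>> out = m(Tensor(np.random.randn(2, 3, 4).astype(np.float32)))
>>> out.numpy().shape  # the input shape is preserved
(2, 3, 4)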

Shape:
  • Input: \((N, *)\) (2-D, 3-D, 4-D or 5-D tensor)

  • Output: \((N, *)\) (same shape as input)

Examples

>>> import numpy as np
>>> from megengine import Tensor
>>> import megengine.module as M
>>> inp = Tensor(np.arange(2 * 3 * 4 * 4).astype(np.float32).reshape(2, 3, 4, 4))
>>> m = M.LayerNorm((4, 4))
>>> out = m(inp)
>>> out.numpy().shape
(2, 3, 4, 4)