LSTMCell

class LSTMCell(input_size, hidden_size, bias=True) [source]

A long short-term memory (LSTM) cell.

\[\begin{split}\begin{array}{ll}
i = \sigma(W_{ii} x + b_{ii} + W_{hi} h + b_{hi}) \\
f = \sigma(W_{if} x + b_{if} + W_{hf} h + b_{hf}) \\
g = \tanh(W_{ig} x + b_{ig} + W_{hg} h + b_{hg}) \\
o = \sigma(W_{io} x + b_{io} + W_{ho} h + b_{ho}) \\
c' = f * c + i * g \\
h' = o * \tanh(c') \\
\end{array}\end{split}\]

where \(\sigma\) is the sigmoid function and \(*\) is the Hadamard product.
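The equations above can be sketched with plain NumPy. This is an illustrative reference implementation, not MegEngine's actual kernel; the function name `lstm_cell_step` and the assumption that the four gates are stacked in the order i, f, g, o within a single weight matrix are choices made here for demonstration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_step(x, h, c, W_i, W_h, b_i, b_h):
    """One LSTM cell step per the equations above.

    x: (input_size,), h, c: (hidden_size,)
    W_i: (4*hidden_size, input_size), W_h: (4*hidden_size, hidden_size)
    b_i, b_h: (4*hidden_size,)
    Gates are assumed stacked in the order i, f, g, o.
    """
    gates = W_i @ x + b_i + W_h @ h + b_h
    i, f, g, o = np.split(gates, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)  # input, forget, output gates
    g = np.tanh(g)                                # candidate cell values
    c_next = f * c + i * g                        # c' = f * c + i * g
    h_next = o * np.tanh(c_next)                  # h' = o * tanh(c')
    return h_next, c_next
```

Because the output gate `o` lies in (0, 1) and `tanh` in (-1, 1), every entry of the new hidden state is bounded in (-1, 1).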

Parameters
  • input_size (int) – the number of expected features in the input `x`

  • hidden_size (int) – the number of features in the hidden state `h`

  • bias (bool) – if False, the layer does not use the bias weights `b_ih` and `b_hh`. Default: True

Shape:
  • Inputs: input, (h_0, c_0)

    input: (batch, input_size). Tensor containing input features.
    h_0: (batch, hidden_size). Tensor containing the initial hidden state for each element in the batch.
    c_0: (batch, hidden_size). Tensor containing the initial cell state for each element in the batch.
    If (h_0, c_0) is not provided, both h_0 and c_0 default to zero.

  • Outputs: (h_1, c_1)

    h_1: (batch, hidden_size). Tensor containing the next hidden state for each element in the batch.
    c_1: (batch, hidden_size). Tensor containing the next cell state for each element in the batch.

Examples

import numpy as np
import megengine as mge
import megengine.module as M

m = M.LSTMCell(10, 20)
inp = mge.tensor(np.random.randn(3, 10), dtype=np.float32)
hx = mge.tensor(np.random.randn(3, 20), dtype=np.float32)
cx = mge.tensor(np.random.randn(3, 20), dtype=np.float32)
hy, cy = m(inp, (hx, cx))
print(hy.numpy().shape)
print(cy.numpy().shape)

Output:

(3, 20)
(3, 20)