LSTMCell

class LSTMCell(input_size, hidden_size, bias=True)

A long short-term memory (LSTM) cell.

$$
\begin{array}{ll}
i = \sigma(W_{ii} x + b_{ii} + W_{hi} h + b_{hi}) \\
f = \sigma(W_{if} x + b_{if} + W_{hf} h + b_{hf}) \\
g = \tanh(W_{ig} x + b_{ig} + W_{hg} h + b_{hg}) \\
o = \sigma(W_{io} x + b_{io} + W_{ho} h + b_{ho}) \\
c' = f * c + i * g \\
h' = o * \tanh(c')
\end{array}
$$

where $$\sigma$$ is the sigmoid function, and $$*$$ is the Hadamard product.
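The gate equations above can be sketched directly in NumPy. The per-gate weight and bias names below (`W_ii`, `b_i`, and so on) mirror the formulas for readability; they are illustrative and do not reflect how the module actually stores its parameters.

```python
import numpy as np

input_size, hidden_size, batch = 10, 20, 3
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative per-gate weights (hypothetical layout, one matrix per gate).
def w_x():  # weight applied to the input x
    return rng.standard_normal((hidden_size, input_size)) * 0.1
def w_h():  # weight applied to the hidden state h
    return rng.standard_normal((hidden_size, hidden_size)) * 0.1

W_ii, W_if, W_ig, W_io = w_x(), w_x(), w_x(), w_x()
W_hi, W_hf, W_hg, W_ho = w_h(), w_h(), w_h(), w_h()
b_i = b_f = b_g = b_o = np.zeros(hidden_size)

def lstm_cell(x, h, c):
    i = sigmoid(x @ W_ii.T + h @ W_hi.T + b_i)  # input gate
    f = sigmoid(x @ W_if.T + h @ W_hf.T + b_f)  # forget gate
    g = np.tanh(x @ W_ig.T + h @ W_hg.T + b_g)  # cell candidate
    o = sigmoid(x @ W_io.T + h @ W_ho.T + b_o)  # output gate
    c_next = f * c + i * g          # new cell state
    h_next = o * np.tanh(c_next)    # new hidden state
    return h_next, c_next

x = rng.standard_normal((batch, input_size))
h0 = np.zeros((batch, hidden_size))  # zero initial states, as the defaults below
c0 = np.zeros((batch, hidden_size))
h1, c1 = lstm_cell(x, h0, c0)
print(h1.shape, c1.shape)  # (3, 20) (3, 20)
```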

Parameters
• input_size (int) – The number of expected features in the input x

• hidden_size (int) – The number of features in the hidden state h

• bias (bool) – If False, then the layer does not use bias weights b_ih and b_hh. Default: True

Inputs: input, (h_0, c_0)
• input of shape (batch, input_size): tensor containing input features

• h_0 of shape (batch, hidden_size): tensor containing the initial hidden state for each element in the batch.

• c_0 of shape (batch, hidden_size): tensor containing the initial cell state for each element in the batch.

If (h_0, c_0) is not provided, both h_0 and c_0 default to zero.

Outputs: (h_1, c_1)
• h_1 of shape (batch, hidden_size): tensor containing the next hidden state for each element in the batch

• c_1 of shape (batch, hidden_size): tensor containing the next cell state for each element in the batch

import numpy as np
import megengine as mge
import megengine.module as M

m = M.LSTMCell(10, 20)  # input_size=10, hidden_size=20
inp = mge.tensor(np.random.randn(3, 10), dtype=np.float32)  # batch of 3 inputs
hx = mge.tensor(np.random.randn(3, 20), dtype=np.float32)   # initial hidden state h_0
cx = mge.tensor(np.random.randn(3, 20), dtype=np.float32)   # initial cell state c_0
hy, cy = m(inp, (hx, cx))  # one step; returns (h_1, c_1)
print(hy.numpy().shape)
print(cy.numpy().shape)


(3, 20)
(3, 20)
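A cell computes a single time step; to consume a sequence you call it once per step, feeding the returned (h, c) back in. The driving loop can be sketched in NumPy with a minimal stand-in cell. The stacked 4×hidden weight layout used here is a common implementation convention, assumed for illustration rather than taken from the module's internals.

```python
import numpy as np

input_size, hidden_size, batch, steps = 10, 20, 3, 5
rng = np.random.default_rng(0)

# Stacked gate weights: 4 * hidden_size rows covering i, f, g, o (assumed layout).
W_ih = rng.standard_normal((4 * hidden_size, input_size)) * 0.1
W_hh = rng.standard_normal((4 * hidden_size, hidden_size)) * 0.1
b_ih = np.zeros(4 * hidden_size)
b_hh = np.zeros(4 * hidden_size)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell(x, h, c):
    gates = x @ W_ih.T + b_ih + h @ W_hh.T + b_hh   # all four gates at once
    i, f, g, o = np.split(gates, 4, axis=1)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    g = np.tanh(g)
    c_next = f * c + i * g
    h_next = o * np.tanh(c_next)
    return h_next, c_next

# Unroll over the sequence, starting from zero states (the same default
# used when no (h_0, c_0) is provided).
h = np.zeros((batch, hidden_size))
c = np.zeros((batch, hidden_size))
outputs = []
for _ in range(steps):
    x = rng.standard_normal((batch, input_size))
    h, c = lstm_cell(x, h, c)
    outputs.append(h)  # collect the hidden state at each step
print(len(outputs), outputs[-1].shape)  # 5 (3, 20)
```

With the real module the loop is the same shape: replace `lstm_cell(x, h, c)` with `m(x, (h, c))`.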