"""Recurrent module interface.
This class defines the interface of a recurrent module that PFRL supports.
The interface is similar to that of `torch.nn.LSTM` except that sequential
data are expected to be packed in `torch.nn.utils.rnn.PackedSequence`.
To implement a model with recurrent layers, you can either use
default container classes such as
`pfrl.nn.RecurrentBranched` or write your own module
extending this class and `torch.nn.Module`.
"""
def forward(self, packed_input, recurrent_state):
"""Multi-step batch forward computation.
packed_input (object): Input sequences. Tensors must be packed in
recurrent_state (object or None): Batched recurrent state.
If set to None, it is initialized.
object: Output sequences. Tensors will be packed in
object or None: New batched recurrent state.
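As a concrete illustration, here is a minimal sketch of a module following this interface. The class name `MyRecurrent` is hypothetical and not part of PFRL; it simply wraps `torch.nn.LSTM`, which already consumes and produces `torch.nn.utils.rnn.PackedSequence` objects and initializes its state when given `None`.

```python
import torch
from torch import nn
from torch.nn.utils.rnn import pack_sequence

class MyRecurrent(nn.Module):
    """Hypothetical example module following the Recurrent interface."""

    def __init__(self, in_size, hidden_size):
        super().__init__()
        self.lstm = nn.LSTM(in_size, hidden_size)

    def forward(self, packed_input, recurrent_state):
        # When recurrent_state is None, nn.LSTM initializes it to zeros.
        packed_output, new_state = self.lstm(packed_input, recurrent_state)
        return packed_output, new_state

# Two variable-length sequences (lengths 5 and 2) with feature size 3,
# packed into a single PackedSequence (longest first, as required).
packed = pack_sequence([torch.randn(5, 3), torch.randn(2, 3)])
m = MyRecurrent(3, 4)
out, state = m(packed, None)      # first call: state initialized internally
out2, state2 = m(packed, state)   # later calls: pass the returned state along
```

The returned `state` is whatever the wrapped layer produces (for LSTM, an `(h, c)` tuple), so an agent can carry it across calls without inspecting its structure.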