The hidden state of GridLSTM is not updated:
GridLSTM-PyTorch/gridlstm.py, lines 191 to 201 in fe063f2:

```python
for dim in range(self.dimensions):
    c = C[dim]
    if dim in self.lstm_dim_map:
        # LSTM update
        cell = self.lstms[self.lstm_dim_map[dim]]
        h, c = cell.forward(inp, (H, c))
    else:
        # Non-LSTM update
        layer = self.nonlstms[self.nonlstm_dim_map[dim]]
        h = layer.forward(H)
    local_state[dim] = (h, c)
```
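A minimal sketch of one way to propagate the update (hypothetical code, not taken from the repository; the `dimensions` and `hidden` sizes are made up): after the loop fills `local_state`, the concatenated hidden state `H` could be rebuilt from the fresh per-dimension `h` tensors so the next step actually sees them:

```python
import torch

# Hypothetical shapes for illustration only.
dimensions = 3
hidden = 4

# Stand-in for local_state after the update loop: one (h, c) pair
# per grid dimension, each h of shape (batch, hidden).
local_state = {d: (torch.randn(2, hidden), torch.randn(2, hidden))
               for d in range(dimensions)}

# Possible fix: re-concatenate the updated h's into H after the loop,
# instead of leaving H at its stale pre-loop value.
H = torch.cat([local_state[d][0] for d in range(dimensions)], dim=1)
print(H.shape)  # torch.Size([2, 12])
```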
By the way, PyTorch checks that the size of the input hidden tensor is consistent with `self.hidden_size`:
https://github.com/pytorch/pytorch/blob/6ebcb4606f079b9152cb242b36e03b8eddcb6173/torch/nn/modules/rnn.py#L504-L513
But GridLSTM's input `H` is concatenated from all the per-dimension `h` tensors, so it is larger than the configured size (GridLSTM-PyTorch/gridlstm.py, line 223 in fe063f2):

```python
self.hidden_size = output_hidden_channels
```
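A small illustration of the mismatch (sizes are made up for the example): an `nn.LSTMCell` configured with `hidden_size=8` is given a hidden state of width 16, mimicking an `H` concatenated from several per-dimension `h` tensors, and PyTorch rejects it:

```python
import torch
import torch.nn as nn

cell = nn.LSTMCell(input_size=4, hidden_size=8)
x = torch.randn(2, 4)
h = torch.randn(2, 16)  # wider than hidden_size=8, like a concatenated H
c = torch.randn(2, 8)

try:
    cell(x, (h, c))
    raised = False
except RuntimeError as e:
    raised = True
    print(e)

print(raised)  # True: the oversized hidden state is rejected
```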