====== Purpose of PyTorch Pack ======
[[https://stackoverflow.com/questions/51030782/why-do-we-pack-the-sequences-in-pytorch|Why do we "pack" the sequences in PyTorch?]]
I have stumbled upon this problem too and below is what I figured out.
Here is a code example:
<code python>
import torch

a = [torch.tensor([1, 2, 3]), torch.tensor([3, 4])]
# Pad the shorter sequence with zeros so both fit into one rectangular tensor
b = torch.nn.utils.rnn.pad_sequence(a, batch_first=True)
# tensor([[1, 2, 3],
#         [3, 4, 0]])

# Pack the padded batch, passing each sequence's true length
torch.nn.utils.rnn.pack_padded_sequence(b, batch_first=True, lengths=[3, 2])
# PackedSequence(data=tensor([1, 3, 2, 4, 3]), batch_sizes=tensor([2, 2, 1]))
</code>
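
The point of building the ''PackedSequence'' is feeding it to a recurrent layer. Below is a minimal sketch of my own (not from the original answer), assuming a toy ''nn.LSTM'' with arbitrary sizes: the layer consumes the packed batch without computing on the padded steps, and ''pad_packed_sequence'' turns the result back into an ordinary padded tensor.

<code python>
import torch

# Same toy batch as above, as floats with a feature dimension of 1
a = [torch.tensor([1., 2., 3.]), torch.tensor([3., 4.])]
b = torch.nn.utils.rnn.pad_sequence(a, batch_first=True)        # shape (2, 3)
packed = torch.nn.utils.rnn.pack_padded_sequence(
    b.unsqueeze(-1), batch_first=True, lengths=[3, 2])          # input_size = 1

# Toy LSTM; input_size/hidden_size are arbitrary assumptions for this sketch
lstm = torch.nn.LSTM(input_size=1, hidden_size=4, batch_first=True)
packed_out, (h, c) = lstm(packed)   # the LSTM skips the padded steps entirely

# Recover an ordinary padded tensor plus the true lengths
out, lengths = torch.nn.utils.rnn.pad_packed_sequence(packed_out, batch_first=True)
print(out.shape)   # torch.Size([2, 3, 4])
print(lengths)     # tensor([3, 2])
</code>

Because the input is packed, the returned hidden state ''h'' already corresponds to each sequence's last //real// timestep, so no manual indexing around the padding is needed.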