Issue
I have a tensor of size torch.Size([118160, 1]). What I want to do is split it into n tensors of 100 elements each, sliding by 50 elements at a time. What's the best way to achieve this with PyTorch?
Solution
A possible solution is:
window_size = 100
stride = 50
splits = [x[i:min(x.size(0), i + window_size)] for i in range(0, x.size(0), stride)]
However, the last few slices will be shorter than window_size. If this is undesired, you can do:
splits = [x[i:i + window_size] for i in range(0, x.size(0) - window_size + 1, stride)]
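For reference, a minimal sanity check of both one-liners (the random tensor below is just a placeholder with the shape from the question; the printed counts follow from the arithmetic):

import torch

# Placeholder tensor with the question's shape [118160, 1].
x = torch.randn(118160, 1)

window_size = 100
stride = 50

# Variant 1: keeps the short slices at the end of the tensor.
with_tails = [x[i:min(x.size(0), i + window_size)] for i in range(0, x.size(0), stride)]

# Variant 2: only windows that fit entirely within the tensor.
full_only = [x[i:i + window_size] for i in range(0, x.size(0) - window_size + 1, stride)]

print(len(with_tails), with_tails[-1].shape)  # 2364 slices; the last has shape [10, 1]
print(len(full_only), full_only[-1].shape)    # 2362 slices, all of shape [100, 1]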
EDIT:
A more readable solution:
# If keep_short_tails is True, slices shorter than window_size at the end of the result are kept.
def window_split(x, window_size=100, stride=50, keep_short_tails=True):
    length = x.size(0)
    splits = []
    if keep_short_tails:
        for slice_start in range(0, length, stride):
            slice_end = min(length, slice_start + window_size)
            splits.append(x[slice_start:slice_end])
    else:
        for slice_start in range(0, length - window_size + 1, stride):
            slice_end = slice_start + window_size
            splits.append(x[slice_start:slice_end])
    return splits
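A quick usage sketch (again with a random placeholder tensor; stacking at the end assumes all slices have equal length, which is why keep_short_tails=False is used there):

import torch

x = torch.randn(118160, 1)  # placeholder with the question's shape

tails = window_split(x)                           # defaults: window_size=100, stride=50
full = window_split(x, keep_short_tails=False)

print(len(tails), tails[-1].shape)  # 2364 slices; the trailing ones are shorter than 100
print(len(full), full[-1].shape)    # 2362 slices, each of shape [100, 1]

# With equal-length slices, the result can be stacked into a single tensor.
batched = torch.stack(full)         # shape [2362, 100, 1]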
Answered By - Yahia Zakaria