Issue
Say I have a batch of four 5x3 matrices, so the dimensions of this tensor are 4x5x3. How do I take the transpose of each matrix within the batch, converting it to 4x3x5?
Solution
I will drop some benchmarks here to compare the performance of the two approaches, using the same tensor proposed in the OP's answer.
In[2]: import torch
In[3]: x = torch.randn(2, 3, 5)
In[4]: x.size()
Out[4]: torch.Size([2, 3, 5])
In[5]: %timeit x.permute(1, 0, 2)
1.03 µs ± 41.7 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)
In[6]: %timeit torch.transpose(x, 0, 1)
892 ns ± 9.61 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)
In[7]: torch.transpose(x, 0, 1).equal(x.permute(1, 0, 2))
Out[7]: True
It is clear that torch.transpose is faster, so it is advised to use it when possible.
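Note that the benchmark above swaps dimensions 0 and 1, while the question asks about the last two dimensions of a 4x5x3 batch. As a minimal sketch of that case, assuming the same approach applies to dimensions 1 and 2:

In[8]: x = torch.randn(4, 5, 3)           # a batch of four 5x3 matrices
In[9]: y = torch.transpose(x, 1, 2)       # swap the last two dims; returns a view
In[10]: y.size()
Out[10]: torch.Size([4, 3, 5])
In[11]: y.equal(x.permute(0, 2, 1))       # permute on the same dims gives the same result
Out[11]: True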
Answered By - Andrew Naguib