Issue
I am practicing with padding layers in PyTorch. 1D and 2D reflection padding work well, but when I try to run the example given for 3D padding, I get the error mentioned in the title.
import torch
import torch.nn as nn

m = nn.ReflectionPad3d(1)
input = torch.arange(8, dtype=torch.float).reshape(1, 1, 2, 2, 2)
m(input)
What can be the reason for this error?
Solution
Unfortunately, there is no ReflectionPad3d in the official release yet. The documentation you are referring to corresponds to the unstable developer preview. Have a look at the padding layers section of the latest stable version, 1.9.0, to see which layers are available. Since the official issue on this topic is already closed, I am sure it will make its way into the next release.
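In the meantime, if you need 3D reflection padding on a stable release, one possible workaround is to reflect-pad the last three dimensions manually with torch.flip and torch.cat. This is a minimal sketch, not PyTorch's implementation; the helper name reflection_pad3d and the assumption that pad is smaller than each padded dimension are mine.

import torch

def reflection_pad3d(x, pad=1):
    # Reflect-pad the last three dims of a 5D tensor (N, C, D, H, W).
    # Assumes pad < size of each padded dimension, as nn.ReflectionPad3d requires.
    for dim in (-1, -2, -3):
        # Mirror the slices next to each border without repeating the border itself.
        left = x.narrow(dim, 1, pad).flip(dim)
        right = x.narrow(dim, x.size(dim) - 1 - pad, pad).flip(dim)
        x = torch.cat([left, x, right], dim=dim)
    return x

input = torch.arange(8, dtype=torch.float).reshape(1, 1, 2, 2, 2)
out = reflection_pad3d(input, pad=1)
print(out.shape)  # torch.Size([1, 1, 4, 4, 4])

This only covers symmetric padding; for asymmetric padding you would pass a separate amount per side, as the built-in layer does.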
Answered By - e.Fro