
Conversation

timbektu
…st (batch_size) dimension, and the code breaks when using batch_size = 1. PyTorch lacks functionality to squeeze multiple dims at once, so reshape() is used instead. Patched the bug in this commit.
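A minimal sketch of the failure mode described above (tensor shapes are illustrative, not taken from the patched code): `squeeze()` with no arguments drops every size-1 dimension, so when batch_size = 1 the batch dimension vanishes along with the intended one, while `reshape()` keeps the batch dimension explicit.

```python
import torch

# Shape (batch_size, 1, 5): squeeze() drops EVERY size-1 dim,
# so with batch_size = 1 the batch dimension disappears too.
x = torch.zeros(1, 1, 5)
print(x.squeeze().shape)                 # torch.Size([5]) -- batch dim lost

# reshape() pins the batch dimension regardless of batch_size.
print(x.reshape(x.shape[0], -1).shape)   # torch.Size([1, 5])

# With batch_size = 4 both approaches happen to agree,
# which is why the bug only surfaces at batch_size = 1.
y = torch.zeros(4, 1, 5)
print(y.squeeze().shape)                 # torch.Size([4, 5])
print(y.reshape(y.shape[0], -1).shape)   # torch.Size([4, 5])
```

(Newer PyTorch releases also accept a tuple of dims, e.g. `x.squeeze((1,))`, but `reshape()` works across versions.)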