Fix: aten::matmul converter behavior with 1d tensors #2450
Merged
Description
Previously, the aten::matmul converter always left-padded the lower-rank tensor to match dimensions between self and other. This does not match PyTorch, which left-pads a 1-D self with a 1 when other has more dimensions and right-pads a 1-D other with a 1 when self has more dimensions, removing the padded dimension after the matmul in both cases.
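For illustration, here is a minimal PyTorch sketch of that promotion rule (the helper name `matmul_with_explicit_1d_promotion` is hypothetical and not part of this converter; it just makes the pad and squeeze steps explicit):

```python
import torch

def matmul_with_explicit_1d_promotion(self_t: torch.Tensor, other: torch.Tensor) -> torch.Tensor:
    # Hypothetical helper mirroring torch.matmul's 1-D promotion rule.
    squeeze_dim = None
    if self_t.dim() == 1 and other.dim() > 1:
        self_t = self_t.unsqueeze(0)    # left pad self: (n,) -> (1, n)
        squeeze_dim = -2
    elif other.dim() == 1 and self_t.dim() > 1:
        other = other.unsqueeze(-1)     # right pad other: (n,) -> (n, 1)
        squeeze_dim = -1
    out = torch.matmul(self_t, other)
    if squeeze_dim is not None:
        out = out.squeeze(squeeze_dim)  # remove the padded dim after the matmul
    return out

# other has more dims than a 1-D self -> self is left padded
a, b = torch.randn(4), torch.randn(2, 4, 3)
assert torch.allclose(matmul_with_explicit_1d_promotion(a, b), torch.matmul(a, b))

# self has more dims than a 1-D other -> other is right padded
c, d = torch.randn(2, 3, 4), torch.randn(4)
assert torch.allclose(matmul_with_explicit_1d_promotion(c, d), torch.matmul(c, d))
```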
Fixes # (issue)
Type of change
Checklist: