
[WIP][AMDAIEPackAndTranspose] Undo folding of rank-preserving packs #787

Draft
wants to merge 2 commits into base: main

Conversation

Contributor

@newling commented Sep 19, 2024

The function linalg::pack doesn't create rank-preserving pack ops, see: https://github.com/llvm/llvm-project/blob/644899addd8fd789c93e9a0f0727d37eb1b29c55/mlir/lib/Dialect/Linalg/Transforms/Transforms.cpp#L542

This upstream behavior cannot easily be changed, because it is thoroughly covered by tests. See: https://github.com/llvm/llvm-project/blob/644899addd8fd789c93e9a0f0727d37eb1b29c55/mlir/test/Dialect/Linalg/transform-op-pack.mlir

In our use case it is sometimes useful to have identity pack ops, specifically when we don't want to pack any dimensions of an operand but still want to set up a new tensor for bufferization.

The use case I have for this is channel-first convolution. This PR is not strictly necessary, because I have an alternative (packing the input and output channel dimensions with size 1), but IMO this approach is neater.
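To illustrate the two options, here is a hedged sketch in MLIR. The shapes, SSA names, and exact op syntax are assumptions for illustration, not taken from this PR:

```mlir
// Assumed example, not from the PR: a rank-preserving "identity" pack that
// tiles no dimensions. The result type matches the source type, but a fresh
// destination tensor is still materialized (useful for bufferization).
%dest = tensor.empty() : tensor<16x64xf32>
%packed = tensor.pack %src inner_dims_pos = [] inner_tiles = []
    into %dest : tensor<16x64xf32> -> tensor<16x64xf32>

// The alternative mentioned above: pack the chosen dimensions with tile
// size 1. This is not rank-preserving; it appends unit inner dimensions.
%dest1 = tensor.empty() : tensor<16x64x1x1xf32>
%packed1 = tensor.pack %src1 inner_dims_pos = [0, 1] inner_tiles = [1, 1]
    into %dest1 : tensor<16x64xf32> -> tensor<16x64x1x1xf32>
```

The first form keeps the operand's rank unchanged while still introducing a new destination tensor; the second achieves a similar effect at the cost of extra unit dimensions.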

If there's general approval I'll add tests.

@newling changed the title AMDAIEPackAndTranspose -- undo upstream's folding of rank-preserving packs [AMDAIEPackAndTranspose] undo upstream's folding of rank-preserving packs Sep 19, 2024
@newling changed the title [AMDAIEPackAndTranspose] undo upstream's folding of rank-preserving packs [WIP][AMDAIEPackAndTranspose] Undo folding of rank-preserving packs Sep 19, 2024
@newling marked this pull request as draft October 7, 2024