
Docs: update torch export docs with modern canonical usage #18564

Closed
stellaraccident opened this issue Sep 20, 2024 · 6 comments · Fixed by #18706
Labels
documentation ✏️ Improvements or additions to documentation integrations/pytorch PyTorch integration work

Comments

@stellaraccident
Collaborator

  • move torch.compile support to an advanced/experimental feature category
  • Show canonical "two liner" of how to compile with separate weights and save to a safetensors file
  • Include iree-run-module example
  • Remove any examples that include jittable
  • Show both external and inlined parameter examples

See: https://discourse.llvm.org/t/how-to-separate-the-body-and-dialect-resources-from-a-module-in-torch-mlir-and-how-to-reload-it/81338/2?u=stellaraccident

@ScottTodd ScottTodd added documentation ✏️ Improvements or additions to documentation integrations/pytorch PyTorch integration work labels Sep 23, 2024
@kumardeepakamd
Contributor

@stellaraccident, did you mean torchscript.compile in your comment?

@stellaraccident
Collaborator Author

Referring to torch.compile here: https://iree.dev/guides/ml-frameworks/pytorch/#just-in-time-jit-execution

It is really just experimental right now and needs to be called out as such.

@vinayakdsci
Contributor

vinayakdsci commented Sep 25, 2024

@stellaraccident just to be sure, @kumardeepakamd and I looked into this, and we have some questions.

  1. move torch.compile support to an advanced/experimental feature category : shark_turbine.aot to shark_turbine.experimental.aot?
  2. Show canonical "two liner" of how to compile with separate weights and save to a safetensors file : Okay, can figure this out
  3. Include iree-run-module example : SG
  4. Remove any examples that include jittable : Remove Just-in-time (JIT) execution section in https://iree.dev/guides/ml-frameworks/pytorch/ ?
  5. Show both external and inlined parameter examples : Show inlined and non-inlined weights? The separate-safetensors case is covered by the bullet above, so essentially also show the case where the weights are inlined?

@ScottTodd
Member

  1. move torch.compile support to an advanced/experimental feature category : shark_turbine.aot to shark_turbine.experimental.aot?

My interpretation is that this focuses just on the documentation for the code as it already exists. Separately we should do the work to make the code more stable. (Also please someone pick up iree-org/iree-turbine#28 - we shouldn't be using the shark_turbine namespace this many months later)

@kumardeepakamd
Contributor

kumardeepakamd commented Sep 25, 2024

So, https://github.com/iree-org/iree-turbine/blob/main/iree/turbine/__init__.py still seems to use shark_turbine? Is someone working on changing that to iree_turbine? or should we take care of that?

@stellaraccident
Collaborator Author

@vinayakdsci @kumardeepakamd

Another user is having an issue with jittable. Can you please respond to them?

https://discourse.llvm.org/t/jittable-error-of-iree-turbine/81457/2?u=stellaraccident

ScottTodd pushed a commit that referenced this issue Sep 27, 2024
Progresses towards #18564.

Fixes an example involving `shark_turbine.aot` that produced a segfault
due to a call to `VmModule.wrap_buffer()` when run.

Also removes examples related to `aot.jittable` that were present in the
doc, and updates the recommended PyTorch version.
vinayakdsci added a commit that referenced this issue Oct 3, 2024
…8658)

Progress on #18564.

Adds examples to the PyTorch guide, showing how to externalize module
parameters, and load them at runtime, both through command line
(`iree-run-module`) and through the iree-runtime Python API (using
`ParameterIndex`).