Issues: NVIDIA/TransformerEngine
- #1249 · TransformerEngine install fails with no clear cause · labels: bug (Something isn't working), build (Build system) · opened Oct 14, 2024 by sytelus
- #1241 · How about torch.compile in TransformerEngine? · label: question (Further information is requested) · opened Oct 11, 2024 by south-ocean
- #1236 · Bug in TransformerEngine v1.11 for PyTorch when using flash-attn>=2.5.7 · opened Oct 10, 2024 by saimidu
- #1207 · No option to change FP8 status in graphed module after using "make_graphed_callables" · label: bug (Something isn't working) · opened Sep 26, 2024 by MaciejBalaNV
- #1193 · FP8 for norm inputs and residuals? · label: question (Further information is requested) · opened Sep 21, 2024 by cbcase
- #1190 · [PyTorch] FP8 and activation checkpointing causes training instabilities · opened Sep 18, 2024 by Marks101
- #1165 · AssertionError: Outputs not close enough in tensor in test_numerics.py · label: bug (Something isn't working) · opened Sep 6, 2024 by sirutBuasai
- #1159 · AssertionError: Device compute capability 8.9 or higher required for FP8 execution. · opened Sep 5, 2024 by kamrul-NSL
- #1047 · [PyTorch] Bug in FP8 buffer update causing training instabilities · opened Jul 26, 2024 by Marks101
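The assertion quoted in #1159 fires when the GPU's CUDA compute capability is below the FP8 threshold. A minimal sketch of that check, assuming the 8.9 threshold stated in the issue title; the helper name `is_fp8_capable` is hypothetical and not part of TransformerEngine's API:

```python
# Hypothetical helper illustrating the compute-capability check behind
# the error in issue #1159: FP8 execution needs capability >= 8.9
# (Ada Lovelace or newer; Hopper sm_90 also qualifies).

def is_fp8_capable(major: int, minor: int) -> bool:
    """Return True if the (major, minor) compute capability is at least 8.9."""
    return (major, minor) >= (8, 9)

# Example capabilities: A100 is sm_80, L40/RTX 4090 are sm_89, H100 is sm_90.
print(is_fp8_capable(8, 0))  # A100: False
print(is_fp8_capable(8, 9))  # L40:  True
print(is_fp8_capable(9, 0))  # H100: True
```

On a real system the tuple would come from `torch.cuda.get_device_capability()`, so the error in #1159 indicates the installed GPU reports a capability below (8, 9).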