ProteomicsLFQ memory consumption #432
Comments
I have analysed this dataset many times. Never had issues.
You can trace the memory consumption while it's running.
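A minimal sketch of one way to do that tracing, assuming the psutil package is available and that the tool shows up as a process named "ProteomicsLFQ"; the process name and polling interval are assumptions, not taken from the log.

```python
#!/usr/bin/env python3
# Sketch: poll the resident set size (RSS) of a running process and log the peak.
# Assumes psutil is installed; the process name "ProteomicsLFQ" is an assumption,
# adjust it to whatever the binary is actually called on your system.
import time
import psutil

PROCESS_NAME = "ProteomicsLFQ"  # assumed name, not confirmed from the log
peak_rss = 0

while True:
    procs = [p for p in psutil.process_iter(["name", "memory_info"])
             if p.info["name"] == PROCESS_NAME]
    if not procs:
        break  # the tool has finished (or has not started yet)
    rss = sum(p.info["memory_info"].rss for p in procs)
    peak_rss = max(peak_rss, rss)
    print(f"current RSS: {rss / 1e9:.2f} GB, peak: {peak_rss / 1e9:.2f} GB")
    time.sleep(5)

print(f"peak RSS observed: {peak_rss / 1e9:.2f} GB")
```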
Me too, but the last release of quantms, 1.3.0, uses OpenMS 3.2.0. We have other ongoing problems with this version and the mzTab export in ProteinQuantifier. @timosachsenberg Can you help us here?
The log does not indicate that it is the export. Is there a way we can find out where/when this regression was introduced?
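One way to narrow it down would be `git bisect run` over the OpenMS history between the last known-good release and 3.2.0, with a small script that rebuilds the checked-out revision, runs ProteomicsLFQ on the files that reproduce the problem, and fails when peak memory exceeds a threshold. Below is a hedged sketch of such a script; the build directory, binary path, input files, and the 50 GB threshold are all assumptions to adapt to the local setup.

```python
#!/usr/bin/env python3
# Sketch of a test script for `git bisect run`. Everything here -- the build
# command, binary path, input files and the 50 GB threshold -- is an assumption
# to adapt; nothing is taken from the attached log.
import resource
import subprocess
import sys

MEMORY_LIMIT_GB = 50  # assumed threshold, well below the observed ~400 GB

# Rebuild the checked-out revision (assumes an already configured build dir).
if subprocess.run(["make", "-C", "build", "ProteomicsLFQ", "-j8"]).returncode != 0:
    sys.exit(125)  # exit code 125 tells git bisect to skip revisions that do not build

# Run ProteomicsLFQ on the two files that reproduce the problem
# (the command line and paths are placeholders).
cmd = ["build/bin/ProteomicsLFQ", "-in", "file1.mzML", "file2.mzML",
       "-ids", "file1.idXML", "file2.idXML", "-out", "out.mzTab"]
result = subprocess.run(cmd)

# ru_maxrss is reported in kilobytes on Linux.
peak_gb = resource.getrusage(resource.RUSAGE_CHILDREN).ru_maxrss / 1e6
print(f"peak child RSS: {peak_gb:.1f} GB")

# A non-zero exit marks the revision as bad for git bisect.
sys.exit(0 if result.returncode == 0 and peak_gb < MEMORY_LIMIT_GB else 1)
```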
I retried OpenMS 3.2.0 and traced the memory. It does exceed the available memory. Why does it run out of memory? The mzML files are only about 10 GB. I haven't encountered this before either.
Thanks for checking. This is really suspicious. Can you reproduce this, e.g., with one or two files?
I can reproduce this with two files, but a single file works. Test files: https://www.dropbox.com/scl/fi/jgbw0pvnm18cga1kwgy54/proteomicslfq.zip?rlkey=6igoyec9ffztk9p8f4uriukct&st=osldx9cp&dl=0
I can confirm that it uses 400 GB for two small files during feature extraction.
Likely related to a different conversion using ThermoRawFileParser (TRFP).
Description of the bug
The errors were reported when I ran the PXD001819 LFQ dataset (about 10 GB of mzML files). It looks like it is running out of memory, but the available memory is 120 GB, so I am not sure whether this is normal or not.
Command used and terminal output
Relevant files
log file: proteomicslfq.log
System information
quantms 1.3.0