
Linux cuda crash #270

Closed
ChengYen-Tang opened this issue Nov 9, 2023 · 9 comments
Labels
bug Something isn't working

Comments

@ChengYen-Tang
Contributor

I'm getting a crash using LLamaSharp.Backend.Cuda11.
The problem occurs in 0.6.0 and 0.7.0; 0.5.1 works fine.
[image: crash screenshot]

Env:
Ubuntu 22.04
[image: environment details]

@martindevans
Member

We have a new set of binaries ready to release in the next version (very soon). Could you try pulling the master branch and testing if it fixes this issue?

@martindevans martindevans added the bug Something isn't working label Nov 11, 2023
@ChengYen-Tang
Contributor Author

Same error

@martindevans
Member

martindevans commented Nov 12, 2023

What CPU and GPU are you using?

Edit: the original post shows Titan RTX as the GPU.

@ChengYen-Tang
Contributor Author

CPU: I7-8700
GPU: Titan RTX 24G

@AsakusaRinne
Collaborator

It's quite weird, because the only change to the native library compilation since v0.6.0 is AVX.

We've just published v0.8.0 with a fix for the cuda library (though I'm not sure it helps with this issue) and the cuda feature detection from #275. Could you please try it?

If it still doesn't work, you could call NativeLibraryConfig.WithLibrary({Your_DLL_PATH}) at the very beginning of your code to load a cuda library you compiled yourself. That could serve as a temporary fix.
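A minimal sketch of that workaround. The library path below is a placeholder, and the exact `NativeLibraryConfig` accessor (`Instance` here) has changed between LLamaSharp versions, so check it against the version you are using:

```csharp
using LLama.Native;

// Point LLamaSharp at a self-compiled native library.
// This must run before ANY other LLamaSharp call: the native
// library is loaded lazily on first use, and the override is
// ignored once loading has already happened.
NativeLibraryConfig.Instance.WithLibrary("/path/to/your/libllama.so");

// ...then create your model weights and context as usual.
```

This only swaps which binary gets loaded; it does not change any model or inference settings.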

@ChengYen-Tang
Contributor Author

Oh, it works normally now.
Another question: how can I stop the logs from printing?

@AsakusaRinne
Collaborator

> Oh, it works normally now.

Is it v0.8.0 that works normally, or your self-compiled library?

> Another question: how can I stop the logs from printing?

Stopping the logs from llama.cpp is a separate issue. For now, you could try redirecting the output elsewhere.

@AsakusaRinne
Collaborator

Update: this comment shows how to capture the output from the native library and redirect it anywhere else.
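For reference, a hedged sketch of what such log capturing can look like. llama.cpp exposes a `llama_log_set` callback hook; `NativeApi.llama_log_set` and the `LLamaLogLevel` enum are assumed to be the LLamaSharp wrappers for it, and their exact names and shapes may differ between releases:

```csharp
using System;
using LLama.Native;

// Redirect llama.cpp's native log output into our own callback
// instead of letting it print straight to stderr.
// Assumption: NativeApi.llama_log_set and LLamaLogLevel exist in
// the LLamaSharp version in use - verify against its NativeApi surface.
NativeApi.llama_log_set((LLamaLogLevel level, string message) =>
{
    // Forward only errors; silently drop the verbose load-time chatter.
    if (level == LLamaLogLevel.Error)
        Console.Error.Write(message);
});
```

Installing the callback before loading any model matters most, since the bulk of the noisy output is printed while the weights are being loaded.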

@ChengYen-Tang
Contributor Author

> Is it v0.8.0 that works normally, or your self-compiled library?

v0.8.0

> Update: this comment shows how to capture the output from the native library and redirect it anywhere else.

Thank you.
