Align with llama.cpp b1488 #249
Conversation
@SignalRT I've started a GitHub action run to build the binaries for all platforms (currently running here). I'll PR them into this branch once the run has finished.
@martindevans, see this other PR #241 because it includes the Intel build, and maybe it should be included. It will enable passing the tests on GitHub on MacOS Intel before they allow running the tests on MacOS ARM.
Related to that other PR, I just opened this issue: #251
…arp/actions/runs/6762323560 (132d25b8a62ea084447e0014a0112c1b371fb3f8)
I just pushed binaries into this PR. To be honest I expected to have to open a PR into your fork; I didn't expect to be able to push them directly! I added all the binaries from this run except for the MacOS ones. Those can be added too if you want, but I'll leave that up to you.
…arp/actions/runs/6762323560 Add the MacOS binary from the same run
@martindevans I added the MacOS binaries from the same run and ran all the tests manually; they all passed.
In response to @martindevans's request in #245 (comment), I have checked out https://github.com/SignalRT/LLamaSharp/tree/Align-Current-Binaries and built it. Then I referenced
while if I reference
You probably need to rename
@martindevans ok, I did not realize this is P/Invoke. Renaming helped.
And as a bonus, the newest version is consistently faster than v0.5.1.
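The rename matters because LLamaSharp binds to the native library through P/Invoke, which resolves the library by an exact file name at load time rather than at compile time. As a rough sketch of the same failure mode in Python's ctypes (the file name below is hypothetical, purely for illustration):

```python
import ctypes

# P/Invoke-style bindings hard-code the file name of the native library
# they load; if the file on disk is named differently, loading fails at
# runtime even though the managed code compiled fine.
try:
    ctypes.CDLL("libllama_wrong_name.so")  # hypothetical name, not on disk
    print("loaded")
except OSError:
    print("load failed: library name does not match any file on disk")
```

Renaming the built binary to the name the bindings expect (or vice versa) resolves this class of error.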
After this PR has been merged, are the binaries ready for publishing a new release? Because of the performance problem in 0.7.0, I think we should publish a 0.7.1 release.
If possible, can we wait for the Intel MacOS PR (#258)? That will act as a test case for the runtime feature detection.
Sure :)
@SignalRT, @martindevans, has 0.8.0 Cuda12 been tested on Windows?
I believe @AsakusaRinne tested them?
Hm, this is fine on Rider; however, LINQPad 7 reports that
I tested with VS and it worked fine. Have you ever had problems using previous versions of LLamaSharp with LINQPad?
@AsakusaRinne, no, older package versions work just fine |
Align with the latest llama.cpp binaries.