
ollama run minicpm-v, runs on cpu #12257

Open
juan-OY opened this issue Oct 23, 2024 · 1 comment

Comments


juan-OY commented Oct 23, 2024

When running the minicpm-v model with ollama, we found that if only the LLM text path is invoked, it runs correctly on the iGPU.
However, when an image and text are submitted together, the LLM falls back to running on the CPU.

ollama run minicpm-v:latest

Test prompt (backslashes in the Windows path must be escaped for valid JSON):

{
  "model": "minicpm-v:latest",
  "prompt": "图片讲了什么内容? (What does the image show?)",
  "images": ["C:\\Users\\MTL\\Pictures\\test.jpg"]
}
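As an aside, Ollama's `/api/generate` endpoint expects each entry in `images` to be base64-encoded image data rather than a file path (file paths only work in the interactive `ollama run` prompt). A minimal sketch of building a well-formed payload, assuming a hypothetical helper `build_generate_payload` and placeholder prompt/image values:

```python
import base64
import json


def build_generate_payload(model: str, prompt: str, image_bytes: bytes) -> str:
    """Build a JSON payload for Ollama's /api/generate with one inline image.

    The "images" field carries base64-encoded image bytes, not file paths.
    """
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "images": [base64.b64encode(image_bytes).decode("ascii")],
    })


if __name__ == "__main__":
    # Placeholder image bytes for illustration; in practice, read the file:
    # image_bytes = open(r"C:\Users\MTL\Pictures\test.jpg", "rb").read()
    payload = build_generate_payload(
        "minicpm-v:latest", "What does the image show?", b"\x89PNG"
    )
    print(payload)
    # The payload can then be POSTed to http://localhost:11434/api/generate,
    # e.g. with urllib.request or curl.
```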

sgwhat (Contributor) commented Oct 24, 2024

Hi @juan-OY, we have reproduced your issue and are currently looking into a solution.
