Support parallel processing for Model 4 #5

Closed
JoelLinn opened this issue Jun 4, 2022 · 3 comments

Comments


JoelLinn commented Jun 4, 2022

Model 4 should be easy to run on two cores, even with Python's practically non-existent parallelism.


36grad commented Apr 22, 2024

I have created a pull request that enables the facerecognition app to use multiple external model instances. While each external model instance still runs on only one core, you can benefit from all the cores you have.

See: matiasdelellis/facerecognition#743

@doppelgrau
Contributor

I made a "competing" one in PR #20, where the container itself accepts multiple parallel connections.

@matiasdelellis
Owner

Great, thanks!
Model 4 itself is not parallel, but since you can run multiple queries in parallel, I think the result is even better.
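The approach discussed above, keeping each model instance single-core but fanning queries out over several instances, can be sketched roughly as follows. This is only an illustration: the endpoint URLs are hypothetical, and the actual HTTP call to the external model is replaced by a placeholder so the snippet is self-contained.

```python
# Sketch: distribute face-recognition queries round-robin across several
# single-core external model instances and run them concurrently.
from concurrent.futures import ThreadPoolExecutor
from itertools import cycle

# Hypothetical endpoints; one external model container listens on each port.
ENDPOINTS = ["http://localhost:5001", "http://localhost:5002"]

def analyze(endpoint, image):
    # In a real setup this would POST the image to the model instance at
    # `endpoint` and parse the JSON response; here we return a placeholder.
    return {"endpoint": endpoint, "image": image, "faces": []}

def analyze_all(images):
    # Pair each image with an endpoint round-robin, then issue the queries
    # concurrently with one worker per instance.
    pairs = zip(cycle(ENDPOINTS), images)
    with ThreadPoolExecutor(max_workers=len(ENDPOINTS)) as pool:
        return list(pool.map(lambda p: analyze(*p), pairs))

results = analyze_all(["a.jpg", "b.jpg", "c.jpg"])
```

Because each query blocks on network I/O in the real setup, plain threads are enough to keep both instances busy; no Python-level CPU parallelism is needed in the client.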
