
Inference with multiple backends? #5553

Closed · Answered by dyastremsky
sabbih-shah asked this question in Q&A

There should be no issue with running inference on models that use multiple backends. Triton loads each backend as needed, based on the models in your model repository.
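For example, a single model repository can mix backends, with each model's config.pbtxt naming its own backend. A minimal sketch (the model names, file names, and backend choices here are illustrative, not from this thread):

```
model_repository/
├── model_a/
│   ├── config.pbtxt      # contains: backend: "onnxruntime"
│   └── 1/
│       └── model.onnx
└── model_b/
    ├── config.pbtxt      # contains: backend: "tensorrt"
    └── 1/
        └── model.plan
```

Pointing Triton at this repository (e.g. `tritonserver --model-repository=/path/to/model_repository`) loads the ONNX Runtime and TensorRT backends side by side.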

That said, I don't think you can do the above with most of the backends. Different versions of a model share the same config, so they'll need to use the same backend. It may be possible with a custom backend or the Python backend.
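One way the Python backend could help, as a hedged sketch: each version directory ships its own model.py, so two versions can differ in implementation while sharing a single config.pbtxt. The layout, tensor names, and computation below are hypothetical:

```python
# model_b/2/model.py -- minimal Python-backend sketch (names hypothetical).
# Version directories 1/ and 2/ can each hold a different model.py, so
# different versions can wrap different code under one config.
import triton_python_backend_utils as pb_utils


class TritonPythonModel:
    def initialize(self, args):
        # args["model_version"] tells this instance which version it serves.
        self.version = args["model_version"]

    def execute(self, requests):
        responses = []
        for request in requests:
            in0 = pb_utils.get_input_tensor_by_name(request, "INPUT0")
            # Placeholder computation; a real model.py would call into
            # whatever framework this particular version wraps.
            out = pb_utils.Tensor("OUTPUT0", in0.as_numpy() * 2.0)
            responses.append(
                pb_utils.InferenceResponse(output_tensors=[out])
            )
        return responses
```

A client still addresses the model by name and, optionally, by version; Triton routes each request to the matching version's model.py.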

Answer selected by sabbih-shah