We will start by deploying Kaito in AWS, since AWS supports Karpenter, which makes the integration easier.
Fei-Guo changed the title from "Onboard Katio to Kubernetes services hosted by other cloud vendor" to "Onboard Katio to Kubernetes services hosted by other cloud vendors" on May 30, 2024.
Hey all, just a quick question: will this feature enhancement include self-hosted Kubernetes? I checked a few places but wasn't sure, so I figured this might be the right place to ask whether it is being considered.
The consideration is that some users, such as the self-hosting community, home labs, and companies, will need the LLMs to run locally.
You can run Kaito in a self-managed Kubernetes cluster if you have already added GPU nodes to the cluster (with the proper GPU driver and Kubernetes device plugin installed). In that case, just list those nodes as `preferredNodes` in the `resource` spec of the Kaito Workspace CR. Kaito will skip provisioning GPU nodes and run the inference workload on the existing nodes.
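A Workspace CR along these lines illustrates the approach; the node names and preset name here are hypothetical placeholders, so check the Kaito API reference for the exact fields supported by your version:

```yaml
# Sketch of a Kaito Workspace that reuses existing GPU nodes
# instead of letting Kaito provision new ones.
apiVersion: kaito.sh/v1alpha1
kind: Workspace
metadata:
  name: workspace-example
resource:
  labelSelector:
    matchLabels:
      apps: llm-inference        # hypothetical label on your GPU nodes
  preferredNodes:
    - gpu-node-1                 # names of pre-provisioned GPU nodes
    - gpu-node-2                 # (GPU driver and device plugin already installed)
inference:
  preset:
    name: falcon-7b              # hypothetical model preset
```

Because `preferredNodes` is populated, Kaito should schedule the inference workload onto those nodes rather than invoking the node provisioner.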