
feat: implement inference server by using vllm #1106


Triggered via pull request October 15, 2024 08:38
Status: Success
Total duration: 29s
Artifacts

dependency-review.yml

on: pull_request
dependency-review (17s)
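The run above comes from a `dependency-review.yml` workflow triggered on `pull_request`. The repository's actual file is not shown here; a minimal sketch of such a workflow, assuming the standard `actions/dependency-review-action`, could look like this:

```yaml
# Hypothetical dependency-review.yml sketch (the PR's real file may differ).
# Scans the pull request's dependency changes for known vulnerabilities.
name: dependency-review

on: pull_request

permissions:
  contents: read

jobs:
  dependency-review:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Run dependency review
        uses: actions/dependency-review-action@v4
```

A single short job like this matches the ~17s duration reported for the `dependency-review` job in the run summary.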