Add olmo and qwen suts #601
Conversation
MLCommons CLA bot: All contributors have signed the MLCommons CLA ✍️ ✅
I'm a little concerned about the stability of that Olmo URL, but that's what they're advertising in the dashboard at the moment, and I don't see an API to fetch it, so let's run with this and see how it goes.
SUTS.register(
    HuggingFaceSUT,
    "olmo-7b-0724-instruct-hf",
    "https://flakwttqzmq493dw.us-east-1.aws.endpoints.huggingface.cloud",
Long term, I'd like to create a layer that lets a more user-y person say they want olmo-7b-0724-instruct for something, and then lets an ops person say, "we prefer to use huggingface for that and here's how". And presumably that latter bit will be via a configuration file. That would help us, and it would also help other users of our code, because they may want to use a different llamaguard provider, including an internal one, rather than the one we've hardcoded.
But for now updating the source isn't much harder than updating a cname. And a cname only solves the problem for us, not for other people using this code. So I'm fine with just leaving this like this and then fixing it when we have more information about where we need things to flex.
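To make that idea concrete, here is a minimal sketch of the kind of configuration-file indirection described above, assuming a TOML file that maps a model name to a provider and an endpoint. The file layout, the keys, and the `resolve_sut` helper are hypothetical; nothing like this exists in the codebase yet.

```python
# Hypothetical sketch of a config-driven SUT lookup: a "user-y" person asks for
# "olmo-7b-0724-instruct", and an ops person decides (in config) which provider
# and endpoint actually serve it. Names and keys here are illustrative only.
import tomllib  # Python 3.11+

CONFIG = """
[suts."olmo-7b-0724-instruct"]
provider = "huggingface"
url = "https://flakwttqzmq493dw.us-east-1.aws.endpoints.huggingface.cloud"

[suts."llamaguard"]
provider = "internal"  # e.g. an in-house deployment instead of the hardcoded one
url = "https://llamaguard.example.internal/v1"
"""

def resolve_sut(name: str) -> dict:
    """Return the provider settings an ops person configured for a model name."""
    return tomllib.loads(CONFIG)["suts"][name]

if __name__ == "__main__":
    settings = resolve_sut("olmo-7b-0724-instruct")
    print(settings["provider"], settings["url"])
```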
Olmo isn't accessible via the chat_completion API, so I created a new type of SUT.
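As a rough illustration of what such a SUT has to do, here is a sketch that posts a plain text-generation payload directly to a Hugging Face Inference Endpoint instead of going through chat_completion. The class name, method shape, and token handling are assumptions for the example, not the actual SUT added in this PR.

```python
# Illustrative sketch only: call a raw Hugging Face Inference Endpoint with a
# text-generation payload rather than a chat_completion request.
import os
import requests

class RawTextGenerationSUT:
    def __init__(self, url: str, token: str):
        self.url = url
        self.headers = {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        }

    def generate(self, prompt: str, max_new_tokens: int = 256) -> str:
        # Standard Inference Endpoint payload: {"inputs": ..., "parameters": {...}}
        payload = {
            "inputs": prompt,
            "parameters": {"max_new_tokens": max_new_tokens},
        }
        response = requests.post(self.url, headers=self.headers, json=payload, timeout=60)
        response.raise_for_status()
        # Text-generation endpoints typically return [{"generated_text": "..."}]
        return response.json()[0]["generated_text"]

if __name__ == "__main__":
    sut = RawTextGenerationSUT(
        "https://flakwttqzmq493dw.us-east-1.aws.endpoints.huggingface.cloud",
        os.environ["HF_TOKEN"],  # assumed env var for the endpoint token
    )
    print(sut.generate("Tell me about the OLMo model."))
```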