I am using a matrix factorization (MF) router in a RAG application and want to download the MF model to my local system. Is this possible? Could you also explain how it works internally? Additionally, can we change the embedding model used by the MF router?
Yes, it is possible to download it from HuggingFace: https://huggingface.co/routellm/mf_gpt4_augmented. However, note that for the time being you will still have to generate the input embeddings with OpenAI's embedding model - @thwu1 is working on adding support for other embedding models in #17!
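On the "how it works internally" part of the question: at a high level, an MF router like this one learns a preference model over prompt embeddings - it projects the prompt embedding into a latent space, interacts it with a learned per-model latent vector, and treats the resulting score as the probability that the strong model wins, routing to the strong model only above a threshold. Here is a minimal pure-Python sketch of that idea. The function names, shapes, toy weights, and the exact bilinear interaction form are illustrative assumptions for intuition only, not the actual architecture of the `routellm/mf_gpt4_augmented` checkpoint.

```python
import math

def mf_score(prompt_embedding, projection, model_vector, bias):
    # Project the (OpenAI) prompt embedding into the latent space: latent = W @ e
    latent = [sum(w * e for w, e in zip(row, prompt_embedding))
              for row in projection]
    # Interact with the learned per-model latent vector (the "factorization" part)
    logit = sum(m * l for m, l in zip(model_vector, latent)) + bias
    # Sigmoid -> predicted probability that the strong model wins on this prompt
    return 1.0 / (1.0 + math.exp(-logit))

def route(prompt_embedding, projection, model_vector, bias, threshold=0.5):
    # Send the query to the strong model only if its predicted win
    # probability clears the cost/quality threshold.
    score = mf_score(prompt_embedding, projection, model_vector, bias)
    return "strong" if score >= threshold else "weak"

# Toy 2-dimensional example (made-up weights, not real checkpoint values)
emb = [1.0, 0.0]
W = [[1.0, 0.0], [0.0, 1.0]]
v = [2.0, 0.0]
print(route(emb, W, v, bias=0.0))  # -> strong
```

The threshold is the knob you tune for the cost/quality trade-off: raising it sends more traffic to the weak model.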
Could you please provide more details or share the implementation code for integrating this into a local system? Specifically, I'm interested in how to set up and use the model, including any necessary configurations and dependencies.