increase embedding provider support #152
Comments
Not only embedding support, but other things besides Ollama, like calling external APIs, would help a lot. For example, I cannot use it with Persian embedding models and some other models.
We do have support for Ollama (thanks to @destrex271), but we are pretty light on documentation for using it: https://github.com/tembo-io/pg_vectorize/blob/main/core/src/transformers/providers/ollama.rs
Can we change this?
Yes, it can be changed. It is a configuration setting in Postgres: when you run this with docker-compose it gets set to the Docker service name by default, but it can be set to whatever value you want. Similarly, you can change the OpenAI URL, so long as the server running there exposes the same API schema as OpenAI.
All the Postgres settings that can be changed are defined here: https://github.com/tembo-io/pg_vectorize/blob/main/extension/src/guc.rs
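As a sketch of how such a setting could be changed, the usual Postgres pattern is `ALTER SYSTEM SET` followed by a config reload. The GUC names below are assumptions for illustration; check `extension/src/guc.rs` for the actual names the extension registers.

```sql
-- Hypothetical GUC names; verify against extension/src/guc.rs.
ALTER SYSTEM SET vectorize.ollama_service_url TO 'http://my-ollama-host:11434';
ALTER SYSTEM SET vectorize.openai_service_url TO 'https://my-openai-compatible-host/v1';

-- Apply the new settings without restarting Postgres.
SELECT pg_reload_conf();
```

Any server can be pointed at this way, as long as it speaks the same API schema as the provider it replaces.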
Let me try it.
Let me know if you run into any issues. You can ping me here, or in the Tembo community Slack.
💎 $150 bounty • Tembo
Steps to solve:
Thank you for contributing to tembo-io/pg_vectorize!
Add support for embeddings from:
Embedding providers are added by implementing the required traits. For example, see the implementation for Cohere.
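To make the trait-based approach concrete, here is a minimal sketch of what a provider implementation could look like. The trait name, method signatures, and struct are illustrative assumptions, not pg_vectorize's actual API; the real traits live under `core/src/transformers/providers` (the actual implementations are async and make HTTP calls).

```rust
// Hypothetical sketch of a pluggable embedding provider.
// Names and signatures are illustrative, NOT pg_vectorize's real trait.

trait EmbeddingProvider {
    /// The model identifier this provider will embed with.
    fn model_name(&self) -> String;

    /// Turn a batch of input strings into embedding vectors.
    fn generate_embeddings(&self, inputs: &[String]) -> Result<Vec<Vec<f32>>, String>;
}

/// A new provider only needs to hold its connection details...
struct MyProvider {
    base_url: String,
    model: String,
}

/// ...and implement the trait.
impl EmbeddingProvider for MyProvider {
    fn model_name(&self) -> String {
        self.model.clone()
    }

    fn generate_embeddings(&self, inputs: &[String]) -> Result<Vec<Vec<f32>>, String> {
        // A real implementation would POST `inputs` to `self.base_url`
        // and parse the provider's response. Here we return fixed-size
        // dummy vectors so the sketch is self-contained.
        let _ = &self.base_url;
        Ok(inputs.iter().map(|_| vec![0.0_f32; 3]).collect())
    }
}
```

The point of the trait boundary is that the rest of the extension only sees `EmbeddingProvider`, so adding a new backend (Cohere, Ollama, an OpenAI-compatible server) is a matter of writing one such implementation.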