
increase embedding provider support #152

Open
ChuckHend opened this issue Oct 12, 2024 · 7 comments
Labels
💎 Bounty enhancement New feature or request

Comments

@ChuckHend
Member

ChuckHend commented Oct 12, 2024

add support for embeddings from:

Embedding providers are added by implementing the required traits. For example see the implementation for Cohere
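A minimal sketch of what implementing such a provider could look like. The trait name `EmbeddingProvider` and its signature here are hypothetical, for illustration only; the real traits live in `core/src/transformers/providers` (see the Cohere implementation for the actual signatures):

```rust
/// Hypothetical provider trait, sketched for illustration; the real trait
/// in pg_vectorize may differ in name, signature, and error type.
pub trait EmbeddingProvider {
    /// Turn a batch of input strings into one embedding vector per input.
    fn generate_embeddings(&self, inputs: &[String]) -> Result<Vec<Vec<f32>>, String>;
}

/// A dummy provider that returns fixed-size zero vectors.
/// A real provider would call the remote embedding API here.
struct DummyProvider {
    dimensions: usize,
}

impl EmbeddingProvider for DummyProvider {
    fn generate_embeddings(&self, inputs: &[String]) -> Result<Vec<Vec<f32>>, String> {
        Ok(inputs.iter().map(|_| vec![0.0_f32; self.dimensions]).collect())
    }
}

fn main() {
    let provider = DummyProvider { dimensions: 4 };
    let out = provider
        .generate_embeddings(&["hello".to_string(), "world".to_string()])
        .unwrap();
    println!("{} embeddings of dimension {}", out.len(), out[0].len());
}
```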

@ChuckHend ChuckHend added the enhancement New feature or request label Oct 12, 2024
@tavallaie

Not only embeddings: other things besides Ollama, like calling external APIs, would help a lot. For example, I cannot use it with Persian embedding models and some other models.

@ChuckHend
Member Author

We do have support for Ollama (thanks to @destrex271) - but we are pretty light on documentation on how to use it.

https://github.com/tembo-io/pg_vectorize/blob/main/core/src/transformers/providers/ollama.rs

@tavallaie

tavallaie commented Oct 14, 2024

> We do have support for Ollama (thanks to @destrex271) - but we are pretty light on documentation on how to use it.
>
> https://github.com/tembo-io/pg_vectorize/blob/main/core/src/transformers/providers/ollama.rs

Can you change `pub const OLLAMA_BASE_URL: &str = "http://localhost:3001";` to read from an environment variable, with an optional API key, so that we can support other online Ollama-compatible web services?
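The requested change could be sketched like this. The variable names `OLLAMA_BASE_URL` and `OLLAMA_API_KEY` are hypothetical, chosen here for illustration; the project may prefer Postgres GUC settings over process environment variables:

```rust
use std::env;

/// Fallback when no environment override is present.
pub const DEFAULT_OLLAMA_BASE_URL: &str = "http://localhost:3001";

/// Resolve the Ollama base URL from the environment, falling back to the
/// default. OLLAMA_BASE_URL is a hypothetical variable name.
pub fn ollama_base_url() -> String {
    env::var("OLLAMA_BASE_URL").unwrap_or_else(|_| DEFAULT_OLLAMA_BASE_URL.to_string())
}

/// Optional API key for hosted Ollama-compatible services; None if unset.
pub fn ollama_api_key() -> Option<String> {
    env::var("OLLAMA_API_KEY").ok()
}

fn main() {
    println!("base url: {}", ollama_base_url());
    println!("api key set: {}", ollama_api_key().is_some());
}
```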

@ChuckHend
Member Author

Yes, it can be changed. It is a configuration in Postgres:

When you run this with docker-compose it gets set to the docker service name by default.

postgres=# show vectorize.ollama_service_url ;
   vectorize.ollama_service_url   
----------------------------------
 http://ollama-serve:3001/v1/chat
(1 row)

but it can be set to whatever value you want, e.g. `ALTER SYSTEM SET vectorize.ollama_service_url TO 'https://www.myservice.ai/embeddings';`. The assumption here is that whatever service is running at vectorize.ollama_service_url has an API (request and response) schema exactly like Ollama's.
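Putting those steps together, repointing the setting could look like this (the URL is a placeholder; `ALTER SYSTEM` changes need a configuration reload before they take effect):

```sql
-- Point vectorize at a different Ollama-compatible service (placeholder URL)
ALTER SYSTEM SET vectorize.ollama_service_url TO 'https://www.myservice.ai/embeddings';
-- Reload the configuration so the new value is picked up
SELECT pg_reload_conf();
-- Verify the change
SHOW vectorize.ollama_service_url;
```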

Similarly, you can change the OpenAI URL, so long as the server running there exposes the same API schema as OpenAI's:

postgres=# show vectorize.openai_service_url ;
 vectorize.openai_service_url 
------------------------------
 https://api.openai.com/v1
(1 row)

All the Postgres settings that can be changed are defined here: https://github.com/tembo-io/pg_vectorize/blob/main/extension/src/guc.rs

@tavallaie

Let me try it

@ChuckHend
Member Author

Let me know if you run into any issues. You can ping me here, or in the Tembo community Slack.


algora-pbc bot commented Oct 17, 2024

💎 $150 bounty • Tembo

Steps to solve:

  1. Start working: Comment /attempt #152 with your implementation plan
  2. Submit work: Create a pull request including /claim #152 in the PR body to claim the bounty
  3. Receive payment: 100% of the bounty is received 2-5 days post-reward. Make sure you are eligible for payouts

Thank you for contributing to tembo-io/pg_vectorize!

