Sample: Dozer + LLM + Vector database + Langchain sample #1690
/bounty $250
💎 $250 bounty • Dozer Data
/attempt #1690
@drrosa you can create a PR on the dozer-samples repo.
Okay, I'll do that. Thanks!
Note: The user @drrosa is already attempting to complete issue #1690 and claim the bounty. If you attempt to complete the same issue, there is a chance that @drrosa will complete it first and be awarded the bounty. We recommend discussing with @drrosa and potentially collaborating on the same solution rather than creating an alternate one.
/attempt #1690
Hi @drrosa and @snork-alt, is this issue still open?
Hi @snork-alt, is the bounty still live? Can you please assign the issue to me?
@snork-alt is this issue still active? |
A few days ago we published an article (https://getdozer.io/blog/llm-chatbot) describing how Dozer could improve hyper-personalization when used together with LLMs, vector databases, and Langchain.
The article describes a hypothetical bank implementing an LLM-based chatbot and leveraging Dozer to build a unified customer profile, which is then passed to the LLM as context to hyper-personalize the chatbot's responses.
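To make the pattern concrete, here is a minimal sketch of fetching a unified profile from a Dozer REST endpoint and injecting it into a Langchain prompt. The endpoint name (`customer_profile`), port, query shape, and field names are illustrative assumptions, not the actual sample:

```python
# Sketch: fetch a unified customer profile from Dozer and pass it to an LLM
# as context. Assumes Dozer serves its REST cache API on localhost:8080 and
# that a customer_profile endpoint exists (both are assumptions here).
import json

import requests
from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

DOZER_REST = "http://localhost:8080"  # assumed Dozer REST address


def fetch_profile(customer_id: int) -> dict:
    # Query the (hypothetical) customer_profile endpoint; the $filter query
    # expression shown here is an assumption about the cache API's schema.
    resp = requests.post(
        f"{DOZER_REST}/customer_profile/query",
        json={"$filter": {"customer_id": customer_id}},
        timeout=10,
    )
    resp.raise_for_status()
    records = resp.json()
    return records[0] if records else {}


prompt = PromptTemplate(
    input_variables=["profile", "question"],
    template=(
        "You are a banking assistant. Use the customer profile below to "
        "personalize your answer.\n\nProfile: {profile}\n\nQuestion: {question}\n"
    ),
)

# Requires OPENAI_API_KEY in the environment.
chain = LLMChain(llm=OpenAI(temperature=0), prompt=prompt)

profile = fetch_profile(customer_id=42)
print(chain.run(profile=json.dumps(profile), question="Which credit card suits me?"))
```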
Based on this article, a complete working sample must be produced. Dozer should be configured to source from multiple datasets (e.g. customer profiles, transactions), and the Dozer APIs should be integrated with Langchain. The sample should build a credit card products use case similar to the one described in the article; a possible configuration is sketched below.
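For reference, a Dozer configuration for such a sample might look roughly like this. The connection details, table and column names, and the profile SQL are all illustrative assumptions; consult the Dozer docs for the exact schema:

```yaml
# Sketch of a dozer-config.yaml, assuming Postgres sources.
app_name: bank-hyper-personalization
connections:
  - name: bank_db
    config: !Postgres
      user: postgres
      password: postgres
      host: localhost
      port: 5432
      database: bank

sources:
  - name: customers
    table_name: customers
    connection: bank_db
  - name: transactions
    table_name: transactions
    connection: bank_db

# Aggregate raw records into one continuously updated profile per customer.
sql: |
  SELECT c.customer_id, c.name, c.segment,
         COUNT(t.txn_id) AS txn_count,
         SUM(t.amount) AS total_spend
  INTO customer_profile
  FROM customers c
  JOIN transactions t ON c.customer_id = t.customer_id
  GROUP BY c.customer_id, c.name, c.segment;

endpoints:
  - name: customer_profile
    path: /customer_profile
    table_name: customer_profile
```

The `customer_profile` endpoint would then be the one queried from Langchain, as in the Python sketch above.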