CustomLLM config to leverage watsonx LLMs with continue.dev.
Updated Aug 27, 2024 - TypeScript
Storing code used in Generative AI Developer Guides on the IBM Developer Website
Helping beekeepers save their bees.
A simple implementation of RAG using watsonx.ai that captures the chat history to keep track of conversation context and answer follow-up questions.
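The pattern this repo describes, keeping the running chat history and feeding it back into the prompt so follow-up questions resolve against earlier turns, can be sketched in a few lines. This is a minimal illustration, not the repo's code: `retrieve` and the `llm` callable are hypothetical stand-ins for the watsonx.ai retrieval and generation calls.

```python
def retrieve(query, corpus):
    """Naive keyword retrieval stand-in: return docs sharing a word with the query."""
    words = set(query.lower().split())
    return [doc for doc in corpus if words & set(doc.lower().split())]

def build_prompt(question, docs, history):
    """Combine retrieved context and prior turns into a single prompt."""
    turns = "\n".join(f"{role}: {text}" for role, text in history)
    context = "\n".join(docs)
    return f"Context:\n{context}\n\nHistory:\n{turns}\n\nQuestion: {question}"

class RagChat:
    def __init__(self, corpus, llm):
        self.corpus = corpus
        self.llm = llm          # in the real repo this would call watsonx.ai
        self.history = []       # (role, text) pairs preserve conversation context

    def ask(self, question):
        docs = retrieve(question, self.corpus)
        answer = self.llm(build_prompt(question, docs, self.history))
        # Store both turns so follow-up questions can reference prior answers.
        self.history.append(("user", question))
        self.history.append(("assistant", answer))
        return answer
```

The key design point is that the history is appended *after* generation, so each prompt contains only completed turns plus the fresh question.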
This repository showcases a comprehensive approach to information retrieval, document re-ranking, and language model integration. It incorporates techniques such as document chunking, embedding projection, and automatic query expansion to enhance the effectiveness of information retrieval systems.
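Two of the techniques named here, document chunking and automatic query expansion, are simple enough to sketch. The functions below are illustrative assumptions (fixed-size character chunks with overlap, and a synonym-table expansion), not the repository's actual implementation.

```python
def chunk(text, size=40, overlap=10):
    """Split text into fixed-size chunks that overlap, so content
    straddling a boundary still appears intact in some chunk."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

def expand_query(query, synonyms):
    """Automatic query expansion: append known synonyms of query terms
    to improve recall during retrieval."""
    extra = [s for word in query.lower().split() for s in synonyms.get(word, [])]
    return query + " " + " ".join(extra) if extra else query
```

Overlapping chunks trade a little index size for robustness: a sentence cut by one chunk boundary is still wholly contained in the neighboring chunk.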
A repository to store pieces of code...
A deployable architecture solution to deploy IBM Watsonx SaaS resources.
This project contains an example implementation of a simple question-answering pipeline using inside-search (IBM Cloud Watson Discovery) and prompting (IBM watsonx with Prompt Lab).
This is an example using the langchain_ibm implementation for function calling with an LLM running in watsonx.
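The function-calling pattern that repo demonstrates, the model emits a structured tool call, the application dispatches it to a registered Python function, can be sketched without the langchain_ibm dependency. Everything below is a hypothetical stand-in: the `tool` registry, the `get_weather` stub, and the hand-written JSON play the roles that `ChatWatsonx` and its bound tools play in the actual example.

```python
import json

TOOLS = {}

def tool(fn):
    """Register a callable so the model may invoke it by name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def get_weather(city: str) -> str:
    # Stub tool; a real one would query a weather service.
    return f"Sunny in {city}"

def dispatch(tool_call_json):
    """Execute the tool call the model emitted and return its result,
    which would then be sent back to the model as a tool message."""
    call = json.loads(tool_call_json)
    return TOOLS[call["name"]](**call["arguments"])
```

In a real loop the JSON would come from the model's response rather than being constructed by hand, and the returned string would be appended to the conversation for a final answer.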
A deployable architecture that automates the deployment of a sample gen AI Pattern on IBM Cloud, including all underlying IBM Cloud and WatsonX infrastructure.
Game Builder uses AI agents to generate, review, and evaluate Python game code, ensuring high-quality and functional game development.
Machine Learning sample assets, notebooks and apps.