📝 Chat History Microservice with MongoDB¶
The Chat History microservice allows you to store, retrieve, and manage chat conversations in a MongoDB database, providing data persistence for OPEA chat applications. It can be integrated into any application by making HTTP requests to the API endpoints described below. This README covers how to set up the service and invoke its API.
+Setup Environment Variables¶
export http_proxy=${your_http_proxy}
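In addition to the proxy settings, the Docker command below expects MongoDB connection variables. A minimal sketch of those exports (the names come from that command; the example values are assumptions and should be adjusted to your environment):
export https_proxy=${your_https_proxy}
export MONGO_HOST=${host_ip}            # host running the MongoDB container (assumed)
export MONGO_PORT=27017                 # default MongoDB port (assumed)
export DB_NAME=${your_db_name}          # target database name (placeholder, assumed)
export COLLECTION_NAME=${your_collection_name}  # target collection name (placeholder, assumed)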
🚀Start Microservice with Docker¶
Build Docker Image¶
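A typical build sketch, assuming the component lives under comps/chathistory/mongo in the GenAIComps repository (the Dockerfile path is an assumption; the image tag matches the run command below):
cd GenAIComps    # repository root (assumed checkout location)
docker build -t opea/chathistory-mongo-server:latest \
--build-arg https_proxy=$https_proxy \
--build-arg http_proxy=$http_proxy \
-f comps/chathistory/mongo/docker/Dockerfile .    # Dockerfile path is an assumption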
Run Docker with CLI¶
Run the MongoDB container
docker run -d -p 27017:27017 --name=mongo mongo:latest
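Optionally, confirm the database container is up before starting the service:
docker ps --filter "name=mongo"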
Run the Chat History microservice
docker run -d --name="chathistory-mongo-server" -p 6013:6013 -p 6012:6012 -p 6014:6014 -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e no_proxy=$no_proxy -e MONGO_HOST=${MONGO_HOST} -e MONGO_PORT=${MONGO_PORT} -e DB_NAME=${DB_NAME} -e COLLECTION_NAME=${COLLECTION_NAME} opea/chathistory-mongo-server:latest
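To verify the microservice started correctly, check the container logs (the container name matches the run command above):
docker logs chathistory-mongo-server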
✅ Invoke Microservice¶
Once the Chat History microservice is up and running, you can manage conversations through the API endpoints below. Creating a conversation returns a unique identifier, which is then used to retrieve, update, or delete that conversation.

Create a new chat conversation
# Example request body; the field values shown are illustrative.
curl -X 'POST' \
http://${host_ip}:6012/v1/chathistory/create \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-d '{
"data": {
"messages": "test Messages", "user": "test"
}
}'
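Since the returned identifier is needed by the calls that follow, it can be captured in a shell variable. A small sketch, assuming the response body is the new conversation id (strip quotes if it comes back as a JSON string):
# Capture the new conversation id for the get/update/delete calls below.
# Assumes the response body is the id itself, possibly JSON-quoted (hence tr -d '"').
ID=$(curl -s -X POST http://${host_ip}:6012/v1/chathistory/create \
-H 'Content-Type: application/json' \
-d '{"data": {"messages": "test Messages", "user": "test"}}' | tr -d '"')
echo ${ID}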
Get all the conversations for a user
curl -X 'POST' \
http://${host_ip}:6012/v1/chathistory/get \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-d '{
"user": "test"}'
Get a specific conversation by id
curl -X 'POST' \
http://${host_ip}:6012/v1/chathistory/get \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-d '{
"user": "test", "id":"668620173180b591e1e0cd74"}'
Update an existing conversation by id (updates reuse the create endpoint)
# Example request body; "id" identifies the conversation to update, and the field values shown are illustrative.
curl -X 'POST' \
http://${host_ip}:6012/v1/chathistory/create \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-d '{
"data": {
"messages": "test Messages Update", "user": "test"
},
"id":"668620173180b591e1e0cd74"
}'
Delete a stored conversation by id
curl -X 'POST' \
http://${host_ip}:6012/v1/chathistory/delete \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-d '{
"user": "test", "id":"668620173180b591e1e0cd74"}'