
Commit b69f9f4: cleanup kserve docs
dtrifiro committed Oct 20, 2023 (1 parent: 1c12ac0)
Showing 1 changed file with 23 additions and 24 deletions: demo/kserve/built-tip.md
# Bootstrap process (optional)

Caikit requires a Caikit-formatted model. Below are the instructions for converting a model to a Caikit-formatted one.

1. Clone the model repository (or have the model folder in a directory). In the example below, the Bloom-560m model repo is cloned.

```bash
yum -y install git git-lfs
git lfs install
git clone https://huggingface.co/bigscience/bloom-560m
```

2. Create a virtual environment with Python 3.9 and install `caikit-nlp`:

```bash
python3.9 -m venv venv
source venv/bin/activate
python3.9 -m pip install git+https://github.com/caikit/caikit-nlp.git
```

3. (Optional) Clone the `caikit-tgis-serving` repo, if not already available.

```bash
git clone https://github.com/opendatahub-io/caikit-tgis-serving.git
```

4. Invoke the conversion script located in the `utils/` directory of the `caikit-tgis-serving` repo.

```bash
caikit-tgis-serving/utils/convert.py --model-path ./bloom-560m/ --model-save-path ./bloom-560m-caikit
```
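A quick sanity check of the conversion output (a sketch; the exact layout of the saved module may vary with the caikit-nlp version) is to list the save path and confirm that a Caikit `config.yml` was written:

```bash
# Inspect the converted model (layout may vary by caikit-nlp version);
# a Caikit-formatted model directory is expected to contain a config.yml.
ls -l ./bloom-560m-caikit/
head ./bloom-560m-caikit/config.yml
```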
5. Move the model folder (e.g. `bloom-560m-caikit`) into the desired storage (S3, MinIO, PVC, or other), for example:
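A minimal upload sketch, assuming an S3/MinIO bucket named `example-models` and an already-configured AWS CLI (the bucket name, path, and tooling here are illustrative only):

```bash
# Hypothetical bucket and prefix, chosen to match the storageUri example below.
aws s3 cp --recursive ./bloom-560m-caikit s3://example-models/llm/models/bloom-560m-caikit/
```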
6. Do **not** include the model folder name/directory directly in the `InferenceService`; instead, point to the directory where the model folder is located. For example, if the `bloom-560m-caikit` directory is located at `example-models/llm/models/bloom-560m-caikit/`, then the `storageUri` value in the `InferenceService` CR should look like:
```yaml
storageUri: s3://example-models/llm/models
```
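For context, here is a minimal `InferenceService` sketch around that `storageUri`; the model-format and runtime names are assumptions based on typical caikit-tgis-serving setups and must match the ServingRuntime installed in your cluster:

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: bloom-560m-caikit
spec:
  predictor:
    model:
      modelFormat:
        name: caikit                # assumption: must match the ServingRuntime's supportedModelFormats
      runtime: caikit-tgis-runtime  # assumption: name of the caikit/TGIS ServingRuntime in your cluster
      storageUri: s3://example-models/llm/models
      # Note: an S3 storageUri typically also needs credentials wired in
      # (e.g. a storage secret or serviceAccountName), omitted in this sketch.
```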
