diff --git a/docs/docs.yml b/docs/docs.yml
index b3cd11885..90f0fa7a3 100644
--- a/docs/docs.yml
+++ b/docs/docs.yml
@@ -67,6 +67,8 @@ navigation:
contents:
- page: Overview
path: docs/snippets/clients/overview.mdx
+ - page: Accordion
+ path: docs/snippets/clients/accordion.mdx
- section: providers
contents:
- page: anthropic
diff --git a/docs/docs/snippets/clients/accordion.mdx b/docs/docs/snippets/clients/accordion.mdx
new file mode 100644
index 000000000..6a364b024
--- /dev/null
+++ b/docs/docs/snippets/clients/accordion.mdx
@@ -0,0 +1,46 @@
+---
+slug: docs/snippets/clients/accordion
+---
+
+This is a placeholder page - do not link to it or use it. It exists to try out AccordionGroup for our providers.
+
+## Providers
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/docs/docs/snippets/clients/overview.mdx b/docs/docs/snippets/clients/overview.mdx
index b850b4851..64ba4d081 100644
--- a/docs/docs/snippets/clients/overview.mdx
+++ b/docs/docs/snippets/clients/overview.mdx
@@ -1,5 +1,5 @@
---
-slug: docs/snippets/clients
+slug: docs/snippets/clients/overview
---
Clients are used to configure how LLMs are called, like so:
@@ -13,13 +13,13 @@ function MakeHaiku(topic: string) -> string {
}
```
-This is `provider/model` shorthand for:
+This is `<provider>/<model>` shorthand for:
```rust BAML
client MyClient {
- provider openai
+ provider "openai"
options {
- model gpt-4o
+ model "gpt-4o"
// api_key defaults to env.OPENAI_API_KEY
}
}
diff --git a/docs/docs/snippets/clients/providers/huggingface.mdx b/docs/docs/snippets/clients/providers/huggingface.mdx
index 5ba1f83c9..dae4d3e29 100644
--- a/docs/docs/snippets/clients/providers/huggingface.mdx
+++ b/docs/docs/snippets/clients/providers/huggingface.mdx
@@ -6,7 +6,7 @@ See https://huggingface.co/docs/inference-endpoints/index for more information o
```baml BAML
client MyClient {
- provider openai
+ provider openai-generic
options {
base_url "https://api-inference.huggingface.co/v1"
api_key env.HUGGINGFACE_API_KEY
diff --git a/docs/docs/snippets/clients/providers/ollama.mdx b/docs/docs/snippets/clients/providers/ollama.mdx
index 5cff5a2cc..2723c327e 100644
--- a/docs/docs/snippets/clients/providers/ollama.mdx
+++ b/docs/docs/snippets/clients/providers/ollama.mdx
@@ -3,7 +3,6 @@ title: ollama
slug: docs/snippets/clients/providers/ollama
---
-
For `ollama`, we provide a client that can be used to interact with [ollama](https://ollama.com/) `/chat/completions` endpoint.
What is ollama? Ollama is an easy way to run LLMs locally!
diff --git a/docs/docs/snippets/clients/providers/together.mdx b/docs/docs/snippets/clients/providers/together.mdx
index f7be46114..a7aae74a7 100644
--- a/docs/docs/snippets/clients/providers/together.mdx
+++ b/docs/docs/snippets/clients/providers/together.mdx
@@ -1,14 +1,17 @@
-
+---
+title: together
+slug: docs/snippets/clients/providers/together
+---
https://www.together.ai/ - The fastest cloud platform for building and running generative AI.
-Together AI supports the OpenAI client, allowing you to use the [openai](/docs/snippets/clients/providers/openai) provider with an overriden `base_url`
+Together AI supports the OpenAI client, allowing you to use the [openai-generic](/docs/snippets/clients/providers/openai-generic) provider with an overridden `base_url`.
See https://docs.together.ai/docs/openai-api-compatibility for more information.
```baml BAML
client MyClient {
- provider openai
+ provider openai-generic
options {
base_url "https://api.together.ai/v1"
api_key env.TOGETHER_API_KEY
diff --git a/docs/docs/snippets/clients/providers/vllm.mdx b/docs/docs/snippets/clients/providers/vllm.mdx
index d589ac56a..45f5426e5 100644
--- a/docs/docs/snippets/clients/providers/vllm.mdx
+++ b/docs/docs/snippets/clients/providers/vllm.mdx
@@ -7,7 +7,7 @@ See https://docs.vllm.ai/en/latest/serving/openai_compatible_server.html for mor
```baml BAML
client MyClient {
- provider openai
+ provider openai-generic
options {
base_url "http://localhost:8000/v1"
api_key "token-abc123"