docs: make dynamic clients discoverable #837

Open · wants to merge 8 commits into base: canary
Changes from 4 commits
4 changes: 1 addition & 3 deletions .gitignore
@@ -100,14 +100,12 @@ $RECYCLE.BIN/
.stylelintcache
.temp
.tern-port
.turbo!*
.turbo
.venv
.venv/# Created by pytest automatically.
.vercel
.vscode-test
.vscode-test/
.vscode/
.vscode/*
.vuepress/dist
.yarn-integrity
.yarn/*
12 changes: 6 additions & 6 deletions docs/docs.yml
@@ -71,22 +71,22 @@ navigation:
path: docs/snippets/clients/providers/azure.mdx
- page: google-ai
path: docs/snippets/clients/providers/gemini.mdx
- page: groq
path: docs/snippets/clients/providers/groq.mdx
- page: huggingface
path: docs/snippets/clients/providers/huggingface.mdx
- page: ollama
path: docs/snippets/clients/providers/ollama.mdx
- page: openai
path: docs/snippets/clients/providers/openai.mdx
- page: vertex-ai
path: docs/snippets/clients/providers/vertex.mdx
- page: openrouter
path: docs/snippets/clients/providers/openrouter.mdx
- page: together-ai
path: docs/snippets/clients/providers/together.mdx
- page: groq
path: docs/snippets/clients/providers/groq.mdx
- page: vertex-ai
path: docs/snippets/clients/providers/vertex.mdx
- page: vllm
path: docs/snippets/clients/providers/vllm.mdx
- page: huggingface
path: docs/snippets/clients/providers/huggingface.mdx
- section: provider strategies
contents:
- page: fallback
12 changes: 8 additions & 4 deletions docs/docs/snippets/client-constructor.mdx
@@ -5,13 +5,17 @@ The configuration modifies the URL request BAML runtime makes.

| Provider Name | Docs | Notes |
| -------------- | -------------------------------- | ---------------------------------------------------------- |
| `openai` | [OpenAI](/docs/snippets/clients/providers/openai) | Anything that follows openai's API exactly |
| `ollama` | [Ollama](/docs/snippets/clients/providers/ollama) | Alias for an openai client but with default ollama options |
| `azure-openai` | [Azure OpenAI](/docs/snippets/clients/providers/azure) | |
| `anthropic` | [Anthropic](/docs/snippets/clients/providers/anthropic) | |
| `aws-bedrock` | [AWS Bedrock](/docs/snippets/clients/providers/aws-bedrock) | |
| `azure-openai` | [Azure OpenAI](/docs/snippets/clients/providers/azure) | |
| `google-ai` | [Google AI](/docs/snippets/clients/providers/gemini) | |
| `ollama` | [Ollama](/docs/snippets/clients/providers/ollama) | Alias for an OpenAI client but with default Ollama options |
| `openai` | [OpenAI](/docs/snippets/clients/providers/openai) | Anything that follows OpenAI's API exactly |
| `vertex-ai` | [Vertex AI](/docs/snippets/clients/providers/vertex) | |
| `aws-bedrock` | [AWS Bedrock](/docs/snippets/clients/providers/aws-bedrock) | |

We also have some special providers that allow composing clients together:
| Provider Name | Docs | Notes |
| -------------- | -------------------------------- | ---------------------------------------------------------- |
| `fallback` | [Fallback](/docs/snippets/clients/fallback) | Chains clients, falling back to the next on failure |
| `round-robin` | [Round Robin](/docs/snippets/clients/round-robin) | Distributes requests across clients for load balancing |
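
For instance, a fallback composition can be sketched as follows (the client names are illustrative; the syntax is the one the fallback docs page describes):

```baml
// Tries PrimaryClient first; on failure, retries with BackupClient.
client<llm> ResilientClient {
  provider fallback
  options {
    strategy [PrimaryClient, BackupClient]
  }
}
```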

10 changes: 9 additions & 1 deletion docs/docs/snippets/clients/overview.mdx
@@ -1,5 +1,5 @@
---
slug: docs/snippets/clients/overview
slug: docs/snippets/clients
---

Clients are used to configure how LLMs are called.
@@ -27,6 +27,14 @@ function MakeHaiku(topic: string) -> string {
}
```

<Tip>
If you want to choose a client at runtime, you can use the
[client registry](/docs/calling-baml/client-registry) from your Python/TS/Ruby code.

This can come in handy if you're trying to, say, send 10% of your requests to a
different model.
</Tip>
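
The "10% of requests" idea in the tip can be illustrated with a toy Python sketch. This only mimics the routing concept with plain dicts and `random`; the names `CLIENTS` and `pick_client` are made up for illustration and are not the baml_py API:

```python
import random

# Hypothetical registry: client names mapped to model identifiers.
CLIENTS = {"main": "gpt-4o", "experiment": "my-finetune"}

def pick_client(rng: random.Random) -> str:
    """Route roughly 10% of requests to the experimental client."""
    return "experiment" if rng.random() < 0.10 else "main"

# Simulate 10,000 requests with a fixed seed and measure the split.
rng = random.Random(0)
choices = [pick_client(rng) for _ in range(10_000)]
share = choices.count("experiment") / len(choices)
print(f"experimental share: {share:.1%}")  # close to 10%
```

In real code the same decision would select which registered client to pass into the BAML call, rather than returning a string.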

## Fields

<Markdown src="../client-constructor.mdx" />
5 changes: 2 additions & 3 deletions engine/Cargo.lock

Some generated files are not rendered by default.

1 change: 1 addition & 0 deletions engine/Cargo.toml
@@ -57,6 +57,7 @@ baml-types = { path = "baml-lib/baml-types" }
internal-baml-codegen = { path = "language_client_codegen" }
internal-baml-core = { path = "baml-lib/baml-core" }
internal-baml-jinja = { path = "baml-lib/jinja" }
internal-baml-schema-ast = { path = "baml-lib/schema-ast" }

[workspace.package]
version = "0.52.1"
8 changes: 4 additions & 4 deletions engine/baml-fmt/Cargo.toml
@@ -8,13 +8,13 @@ description.workspace = true
license-file.workspace = true

[dependencies]
colored = "2"
baml-lib = { path = "../baml-lib/baml" }
internal-baml-jinja = { path = "../baml-lib/jinja" }
anyhow.workspace = true

indoc.workspace = true
internal-baml-core.workspace = true
internal-baml-schema-ast.workspace = true
serde_json.workspace = true
serde.workspace = true
indoc.workspace = true
lsp-types = "0.91.1"
log = "0.4.14"
enumflags2 = "0.7"
16 changes: 0 additions & 16 deletions engine/baml-fmt/build.rs

This file was deleted.

238 changes: 119 additions & 119 deletions engine/baml-fmt/src/lib.rs
@@ -1,119 +1,119 @@
#![allow(dead_code)]

mod validate;

use baml_lib::{internal_baml_parser_database::ast, SourceFile};

use lsp_types::{Position, Range};

pub fn call_llm(schema: String) -> String {
schema
}

pub fn validate(validate_params: String) -> Result<(), String> {
validate::validate(&validate_params)
}

/// The LSP position is expressed as a (line, col) tuple, but our pest-based parser works with byte
/// offsets. This function converts from an LSP position to a pest byte offset. Returns `None` if
/// the position has a line past the end of the document, or a character position past the end of
/// the line.
pub(crate) fn position_to_offset(position: &Position, document: &str) -> Option<usize> {
let mut offset = 0;
let mut line_offset = position.line;
let mut character_offset = position.character;
let mut chars = document.chars();

while line_offset > 0 {
loop {
match chars.next() {
Some('\n') => {
offset += 1;
break;
}
Some(_) => {
offset += 1;
}
None => return Some(offset),
}
}

line_offset -= 1;
}

while character_offset > 0 {
match chars.next() {
Some('\n') | None => return Some(offset),
Some(_) => {
offset += 1;
character_offset -= 1;
}
}
}

Some(offset)
}

#[track_caller]
/// Converts an LSP range to a span.
pub(crate) fn range_to_span(range: Range, document: &str) -> ast::Span {
let start = position_to_offset(&range.start, document).unwrap();
let end = position_to_offset(&range.end, document).unwrap();

ast::Span::new(
SourceFile::from(("<unknown>".into(), "contents".to_string())),
start,
end,
)
}

/// Gives the LSP position right after the given span.
pub(crate) fn position_after_span(span: ast::Span, document: &str) -> Position {
offset_to_position(span.end - 1, document)
}

/// Converts a byte offset to an LSP position, if the given offset
/// does not overflow the document.
pub fn offset_to_position(offset: usize, document: &str) -> Position {
let mut position = Position::default();

for (i, chr) in document.chars().enumerate() {
match chr {
_ if i == offset => {
return position;
}
'\n' => {
position.character = 0;
position.line += 1;
}
_ => {
position.character += 1;
}
}
}

position
}

#[cfg(test)]
mod tests {
use lsp_types::Position;

// On Windows, a newline is actually two characters.
#[test]
fn position_to_offset_with_crlf() {
let schema = "\r\nmodel Test {\r\n id Int @id\r\n}";
// Let's put the cursor on the "i" in "id Int".
let expected_offset = schema.chars().position(|c| c == 'i').unwrap();
let found_offset = super::position_to_offset(
&Position {
line: 2,
character: 4,
},
schema,
)
.unwrap();

assert_eq!(found_offset, expected_offset);
}
}
// #![allow(dead_code)]

// mod validate;

// use baml_lib::{internal_baml_parser_database::ast, SourceFile};

// use lsp_types::{Position, Range};

// pub fn call_llm(schema: String) -> String {
// schema
// }

// pub fn validate(validate_params: String) -> Result<(), String> {
// validate::validate(&validate_params)
// }

// /// The LSP position is expressed as a (line, col) tuple, but our pest-based parser works with byte
// /// offsets. This function converts from an LSP position to a pest byte offset. Returns `None` if
// /// the position has a line past the end of the document, or a character position past the end of
// /// the line.
// pub(crate) fn position_to_offset(position: &Position, document: &str) -> Option<usize> {
// let mut offset = 0;
// let mut line_offset = position.line;
// let mut character_offset = position.character;
// let mut chars = document.chars();

// while line_offset > 0 {
// loop {
// match chars.next() {
// Some('\n') => {
// offset += 1;
// break;
// }
// Some(_) => {
// offset += 1;
// }
// None => return Some(offset),
// }
// }

// line_offset -= 1;
// }

// while character_offset > 0 {
// match chars.next() {
// Some('\n') | None => return Some(offset),
// Some(_) => {
// offset += 1;
// character_offset -= 1;
// }
// }
// }

// Some(offset)
// }

// #[track_caller]
// /// Converts an LSP range to a span.
// pub(crate) fn range_to_span(range: Range, document: &str) -> ast::Span {
// let start = position_to_offset(&range.start, document).unwrap();
// let end = position_to_offset(&range.end, document).unwrap();

// ast::Span::new(
// SourceFile::from(("<unknown>".into(), "contents".to_string())),
// start,
// end,
// )
// }

// /// Gives the LSP position right after the given span.
// pub(crate) fn position_after_span(span: ast::Span, document: &str) -> Position {
// offset_to_position(span.end - 1, document)
// }

// /// Converts a byte offset to an LSP position, if the given offset
// /// does not overflow the document.
// pub fn offset_to_position(offset: usize, document: &str) -> Position {
// let mut position = Position::default();

// for (i, chr) in document.chars().enumerate() {
// match chr {
// _ if i == offset => {
// return position;
// }
// '\n' => {
// position.character = 0;
// position.line += 1;
// }
// _ => {
// position.character += 1;
// }
// }
// }

// position
// }

// #[cfg(test)]
// mod tests {
// use lsp_types::Position;

// // On Windows, a newline is actually two characters.
// #[test]
// fn position_to_offset_with_crlf() {
// let schema = "\r\nmodel Test {\r\n id Int @id\r\n}";
// // Let's put the cursor on the "i" in "id Int".
// let expected_offset = schema.chars().position(|c| c == 'i').unwrap();
// let found_offset = super::position_to_offset(
// &Position {
// line: 2,
// character: 4,
// },
// schema,
// )
// .unwrap();

// assert_eq!(found_offset, expected_offset);
// }
// }