LLM Context is a command-line tool that helps developers quickly paste relevant content from code and text projects into the web chat interface of Large Language Models. It leverages `.gitignore` patterns for smart file selection and uses the clipboard for seamless integration with AI chats.
Note: This project was developed in collaboration with Claude-3.5-Sonnet, using LLM Context itself to share code context during development. All code in the repository is human-curated (by me 😇, @restlessronin).
For an in-depth exploration of the reasoning behind LLM Context and its approach to AI-assisted development, check out our article: LLM Context: Harnessing Vanilla AI Chats for Development
- LLM Integration: Primarily used with Claude (Project Knowledge) and GPT (Knowledge), but supports all LLM chat interfaces.
- Project Types: Suitable for code repositories and collections of text/markdown/html documents.
- Project Size: Optimized for projects that fit within an LLM's context window. Large project support is in development.
Use pipx to install LLM Context:

```shell
pipx install llm-context
```
LLM Context enables rapid project context updates for each AI chat.
1. Install LLM Context if you haven't already.
2. Navigate to your project's root directory.
3. Run `lc-init` to set up LLM Context for your project (only needed once per repository).
4. For chat interfaces with built-in context storage (e.g., Claude Pro Projects, ChatGPT Plus GPTs):
   - Set up your custom prompt manually in the chat interface.
   - A default prompt is available in `.llm-context/templates/lc-prompt.md`.
5. (Optional) Edit `.llm-context/config.toml` to add custom ignore patterns.
6. Run `lc-sel-files` to select files for full content inclusion.
7. (Optional) Review the selected file list in `.llm-context/curr_ctx.toml`.
8. Generate and copy the context:
   - For chat interfaces with persistent context: run `lc-context`
   - For other interfaces (including free plans): run `lc-context --with-prompt` to include the default prompt
9. Paste the generated context:
   - For interfaces with persistent context: into the Project Knowledge (Claude Pro) or GPT Knowledge (ChatGPT Plus) section
   - For other interfaces: into the system message or the first chat message, as appropriate
10. Start your conversation with the LLM about your project.
To maintain up-to-date AI assistance:
- Repeat steps 6-9 at the start of each new chat session. This process takes only seconds.
- For interfaces with persistent context, update your custom prompt separately if needed.
When the LLM asks for a file that isn't in the current context:
- Copy the LLM's file request (typically in a markdown block) to your clipboard.
- Run `lc-read-cliplist` to generate the content of the requested files.
- Paste the generated file contents back into your chat with the LLM.
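If you want to script around this flow, the pasted request can be parsed with a short sketch. This is purely illustrative: it assumes the model replies with a simple bulleted list of paths, and the exact format `lc-read-cliplist` accepts may differ.

```python
# Hypothetical sketch of extracting file paths from an LLM's request,
# assuming a simple bulleted list of repo-relative paths. The actual
# clipboard format lc-read-cliplist expects may differ.
request = """I need to see these files to continue:
- /src/app.py
- /src/utils/helpers.py
"""
paths = [line[2:].strip() for line in request.splitlines()
         if line.startswith("- ")]
print(paths)  # → ['/src/app.py', '/src/utils/helpers.py']
```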
Add custom ignore patterns in `.llm-context/config.toml` to exclude specific files or directories not covered by your project's `.gitignore`. This is useful for versioned files that don't contribute to code context, such as media files, large generated files, detailed changelogs, or environment-specific configurations.
Example:
```toml
# /.llm-context/config.toml
[gitignores]
full_files = [
  "*.svg",
  "*.png",
  "CHANGELOG.md",
  ".env",
  # Add more patterns here
]
```
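The effect of patterns like those above can be illustrated with a small sketch. This is not the tool's actual matcher (which also honors your project's `.gitignore` and full gitignore semantics); it is a simplified, basename-only illustration using Python's `fnmatch`:

```python
from fnmatch import fnmatch

# Simplified illustration of gitignore-style filtering: patterns are
# matched against basenames only. llm-context's real selection logic
# is more complete and also applies your project's .gitignore.
ignore_patterns = ["*.svg", "*.png", "CHANGELOG.md", ".env"]

def is_ignored(path: str) -> bool:
    name = path.rsplit("/", 1)[-1]
    return any(fnmatch(name, pat) for pat in ignore_patterns)

files = ["src/main.py", "logo.svg", "CHANGELOG.md", "docs/intro.md", ".env"]
selected = [f for f in files if not is_ignored(f)]
print(selected)  # → ['src/main.py', 'docs/intro.md']
```

Only files that contribute useful code context survive the filter; media files, the changelog, and the environment file are dropped.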
Review the list of selected files in `.llm-context/curr_ctx.toml` to check what's included in the context. This is particularly useful when trying to minimize context size.
```toml
# /.llm-context/curr_ctx.toml
[context]
full = [
  "/llm-context.py/pyproject.toml",
  # more files ...
]
```
- `lc-init`: Initialize LLM Context for your project (only needed once per repository)
- `lc-sel-files`: Select files for full content inclusion
- `lc-sel-outlines`: Select files for outline inclusion (experimental)
- `lc-context`: Generate and copy context to clipboard
  - Use the `--with-prompt` flag to include the prompt for chat interfaces without persistent context
- `lc-read-cliplist`: Read contents for LLM-requested files, and copy to clipboard
For larger projects, we're exploring a combined approach of full file content and file outlines. Use `lc-sel-outlines` after `lc-sel-files` to experiment with this feature.
Note: The outlining feature currently supports the following programming languages: C, C++, C#, Elisp, Elixir, Elm, Go, Java, JavaScript, OCaml, PHP, Python, QL, Ruby, Rust, and TypeScript. Files in unsupported languages will not be outlined and will be excluded from the outline selection.
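To give a feel for what an outline captures, here is a rough sketch: top-level definitions without their bodies. This is not the tool's implementation (which uses tree-sitter queries and supports the languages listed above); it is a Python-only approximation using the standard `ast` module:

```python
import ast

# Rough sketch of a file "outline": top-level class and function names,
# with bodies omitted. llm-context's real outliner uses tree-sitter tag
# queries and works across many languages; this ast version is Python-only.
source = '''
class Store:
    def add(self, item): ...

def main():
    print("hello")
'''
tree = ast.parse(source)
outline = [node.name for node in tree.body
           if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef, ast.ClassDef))]
print(outline)  # → ['Store', 'main']
```

An outline like this lets the LLM see a file's structure at a fraction of the token cost of its full content.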
We welcome feedback, issue reports, and pull requests on our GitHub repository.
LLM Context evolves from a lineage of AI-assisted development tools:
- This project succeeds LLM Code Highlighter, a TypeScript library I developed for IDE integration.
- The concept originated from my work on RubberDuck and continued with later contributions to Continue.
- LLM Code Highlighter was heavily inspired by Aider Chat. I worked with GPT-4 to translate several Aider Chat Python modules into TypeScript, maintaining functionality while restructuring the code.
- This project uses tree-sitter tag query files from Aider Chat.
- LLM Context exemplifies the power of AI-assisted development, transitioning from Python to TypeScript and back to Python with the help of GPT-4 and Claude-3.5-Sonnet.
I am grateful for the open-source community's innovations and for the AI assistance, particularly from Claude-3.5-Sonnet, that have shaped this project's evolution.
This project is licensed under the Apache License, Version 2.0. See the LICENSE file for details.