What?
Add Ollama as an LLM option.
Why?
Ollama allows you to run an LLM model/service locally with minimal effort. This can be especially important, for example, when demoing the project and derivative projects in a network-disconnected environment.
Implementation Considerations
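For reference, here is a minimal sketch of what calling a locally running Ollama instance could look like. It assumes Ollama is serving on its default port (11434) and that a model such as `llama3` has already been pulled; the model name, the Python helper, and how this would plug into the project's existing LLM abstraction are all assumptions, not a final design.

```python
# Minimal sketch: query a local Ollama server via its HTTP API.
# Assumes Ollama is running on the default port 11434 and that the
# model (here "llama3", an arbitrary choice) has been pulled already.
import requests


def ask_ollama(prompt: str, model: str = "llama3") -> str:
    """Send a single prompt to a local Ollama server and return the completion."""
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    # With stream=False, Ollama returns one JSON object whose "response"
    # field holds the full completion text.
    return resp.json()["response"]


if __name__ == "__main__":
    print(ask_ollama("Summarize what Ollama does in one sentence."))
```

Because Ollama exposes a plain local HTTP endpoint, the integration should not require any cloud credentials, which is what makes the disconnected-demo use case straightforward.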