Local LLM support

James Reynolds

May 21, 2024, 5:34:57 PM
to iterm2-discuss
It would be great if iTerm2 could be pointed at a local LLM like Ollama, which exposes an OpenAI-compatible API. See https://github.com/ollama/ollama/blob/main/docs/api.md and https://ollama.com/blog/openai-compatibility.
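
Because Ollama speaks the OpenAI protocol, the stock OpenAI client can talk to it if you override the base URL. A rough sketch in Python, assuming Ollama is running on its default port 11434 and a model such as llama3 has already been pulled (both are my assumptions, not anything iTerm2-specific):

    from openai import OpenAI

    # Point the standard OpenAI client at the local Ollama server.
    client = OpenAI(
        base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
        api_key="ollama",                      # required by the client, ignored by Ollama
    )

    response = client.chat.completions.create(
        model="llama3",
        messages=[{"role": "user", "content": "Say hello"}],
    )
    print(response.choices[0].message.content)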

Khalil Gharbaoui

May 27, 2024, 3:41:51 PM
to iterm2-discuss
This would indeed be very good to have, since we don't want to send all our data to OpenAI.
Beyond that, it would also be great to be able to pick our own LLM flavor.

George Nachman

May 28, 2024, 1:54:25 AM
to iterm2-...@googlegroups.com
Please try the latest beta version. It moves the URL setting into the main settings panel (was in Advanced before). This should make it easier to configure a local LLM, but let me know if more is needed.
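
If anyone wants to sanity-check their endpoint before entering the URL in Settings, a quick standard-library probe like the following should return a completion. This again assumes Ollama's defaults (port 11434, a pulled llama3 model); adjust the URL and model name for other servers:

    import json
    import urllib.request

    # POST a one-message chat to the local OpenAI-compatible endpoint.
    req = urllib.request.Request(
        "http://localhost:11434/v1/chat/completions",
        data=json.dumps({
            "model": "llama3",
            "messages": [{"role": "user", "content": "ping"}],
        }).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
        print(reply["choices"][0]["message"]["content"])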
