Local AI Integration

Matthew Chisolm

Jun 12, 2024, 6:19:07 PM
to iterm2-discuss
First, thanks for making such a great terminal, making it open source and free, and focusing on privacy and security. 😁

When I updated today, I read in the release notes that the move of the AI settings is intended, in part, to make it easier to use local models.

Did I misread that, or does it mean users can use a locally installed model that does not send data anywhere? If that was the intent, I was unable to find it in the settings, the docs, or this forum.

James Reynolds

Jun 13, 2024, 12:39:35 PM
to iterm2-...@googlegroups.com
It's the custom URL. In my case, I'm using Ollama.
[Attachment: Screenshot 2024-06-12 at 17.14.14.png]
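As a sanity check that a local Ollama server accepts OpenAI-style chat requests (the request shape a custom-URL client sends), here is a minimal sketch. The port 11434 is Ollama's default, and the model name `llama3` is an assumption; substitute whatever `ollama list` shows on your machine.

```python
import json
import urllib.request

# Ollama's OpenAI-compatible endpoint (default port; assumes
# `ollama serve` is running locally).
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"


def build_request(prompt: str, model: str = "llama3") -> dict:
    """Build an OpenAI-style chat-completion payload.

    The model name is an assumption -- use a model you have pulled.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }


def ask(prompt: str, model: str = "llama3") -> str:
    """POST the prompt to the local server; no data leaves the machine."""
    data = json.dumps(build_request(prompt, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Calling `ask("...")` while `ollama serve` is running returns the model's reply entirely from the local process, which is the privacy property the original question was after.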