>> Which also makes me think that if you're going to support local AI models, you might want more preferences than just the URL of the API. You might want to run them from a CLI tool like llama.cpp or ollama, which would mean you'd need to be able to change the CLI flags passed to the tools. This is probably a good reason not to support local AI models or anything via the CLI, since it just means bloat and losing focus.
>
> At that point I think you could usefully employ a Shell Worksheet. :-)
Wow. I wrote a big long email saying that Shell Worksheets weren't interactive, and then I thought: what if I pressed Control-Return when it was waiting for user input? And it worked! It's not as nice as the ChatGPT worksheet, but it's still fun to play with, and it's easier to work with than the Terminal, that's for sure.
```
llama.cpp -m /Users/james/Downloads/dolphin-2.6-mistral-7b-dpo-laser.Q6_K.gguf -i --interactive-first
*SKIPPING PAGES OF DEBUGGING OUTPUT*
You are an unhelpful AI assistant. You will answer briefly when asked questions. Sometimes you won't answer the question but you will tell the user random facts about space. Here is the first question.\
\
Question: Who was the first president of the United States of America?
Unhelpful but entertaining response: Did you know that there are over 200 billion stars in our galaxy, the Milky Way? That's more than enough to create a unique night sky for each person on Earth! Now let me remind you... The first president of the United States was George Washington.
```
If you have an M-series Mac, you can do this. llama.cpp comes from here:
https://github.com/ggerganov/llama.cpp. In this case my `llama.cpp` is a link in /usr/local/bin that points to the `main` binary in the llama.cpp project. I got the model from here:
https://huggingface.co/TheBloke/dolphin-2.6-mistral-7B-dpo-laser-GGUF/tree/main.
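If you want to reproduce that setup, something like the following should work. This is only a sketch: the clone-and-make steps are the standard llama.cpp build (Metal support is enabled by default on Apple Silicon), and the symlink path is just where I happened to put mine.
```
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make
# Link the resulting `main` binary somewhere on your PATH under the name `llama.cpp`.
sudo ln -s "$PWD/main" /usr/local/bin/llama.cpp
```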
One note: it looks like the default working directory for a Shell Worksheet is /. When I ran llama.cpp from there it couldn't write its log file (the root filesystem is read-only, as you can see below) and I got really strange output. Running `cd ~` first fixed it. Here's the weird output:
```
main -m /Users/james/Downloads/dolphin-2.6-mistral-7b-dpo-laser.Q6_K.gguf -p "What is your name?"
Failed to open logfile 'main.log' with error 'Read-only file system'
[1705425385] Log start
[1705425385] Cmd: /Users/james/.pkgx/github.com/ggerganov/llama.cpp/v1645.0.0/bin/llama.cpp -m /Users/james/Downloads/dolphin-2.6-mistral-7b-dpo-laser.Q6_K.gguf -p "What is your name?"
***skipping several pages of debugging output***
[1705425387] embd_inp.size(): 6, n_consumed: 0
[1705425387] eval: [ '':1, ' What':1824, ' is':349, ' your':574, ' name':1141, '?':28804 ]
[1705425387] n_past = 6
[1705425387] sampled token: 13: '
'
[1705425387] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' What':1824, ' is':349, ' your':574, ' name':1141, '?':28804, '':13 ]
[1705425387] n_remain: -2
[1705425387] eval: [ '':13 ]
```
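For completeness, the workaround in the worksheet is just to change to a writable directory before launching:
```
# Shell Worksheets appear to start in /, which main can't write its log file to,
# so change to your home directory first.
cd ~
llama.cpp -m /Users/james/Downloads/dolphin-2.6-mistral-7b-dpo-laser.Q6_K.gguf -i --interactive-first
```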
James Reynolds