Llama response exceeds 'say' block capacity

Ilias Birdas

May 10, 2025, 3:45:06 AM
to Machine Learning for Kids
I am using the medium-sized Llama model to experiment with AI answers.
One problem I came across is that the responses the AI model provides are sometimes lengthy and do not fit in the space that the SAY block creates; they end up being cut off abruptly.
I believe it's some kind of native limit Scratch has on the supported text size (?)

Any ideas on how to overcome this?
a) Is there any way to instruct the AI model to keep the answer short, under the 300-character limit?
b) Are there any hacks for displaying longer texts within Scratch?

Kind Regards
Ilias

Dale Lane

May 10, 2025, 3:46:18 AM
to Machine Learning for Kids
If you look at the "Story teller" worksheet from https://machinelearningforkids.co.uk/worksheets, you can see an example of how to handle this.

In short, yes - you can include instructions to keep the answer short in your prompt.
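
For example, an instruction along these lines could be added to the start of the prompt text (illustrative wording only, not the exact text from the worksheet):

    "Answer the following question in no more than two short sentences, using fewer than 300 characters in total. Question: ..."

The exact wording is up to you; the point is that the length requirement goes into the prompt itself, so the model is asked up front to keep its reply within what the Scratch say block can display.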

Kind regards

D
