Seems relevant given the recent discussion


Alan Karp

Jan 7, 2026, 8:59:03 PM
to <friam@googlegroups.com>

William ML Leslie

Jan 7, 2026, 9:21:42 PM
to Design

Raoul Duke

Jan 7, 2026, 9:23:39 PM
to fr...@googlegroups.com
Might be cool if all those things could be hooked up to a SOTA LLM/AI
so I can have a conversational way of learning wtf is happening in my
program?

William ML Leslie

Jan 7, 2026, 9:35:13 PM
to fr...@googlegroups.com
On Thu, 8 Jan 2026 at 12:23, Raoul Duke <rao...@gmail.com> wrote:
> Might be cool if all those things could be hooked up to a SOTA LLM/AI
> so I can have a conversational way of learning wtf is happening in my
> program?

LLM use cases are the ultimate use case for the hotline bling meme.

[image: hotline bling meme]


--
William ML Leslie

William ML Leslie

Jan 7, 2026, 10:42:20 PM
to fr...@googlegroups.com
I'm terribly sorry, I didn't think that was funny enough. I've illustrated the 2023+ UX meta below. HTH.