Robert Oschler
Dec 2, 2024, 4:05:00 PM
to Chrome Built-in AI Early Preview Program Discussions
I rewrote my Chrome V3 extension code to create and destroy the Prompt API session on each query. I did this because, when I kept a persistent prompt session object, the LLM output clearly indicated it was "remembering" previous inputs. I'm guessing the default behavior of the Prompt API model is chat-based?
This code change worked, but answering a query now appears to take longer. If I went back to using a persistent Prompt API session object, is there a way to make the LLM "forget" all previous interactions and treat the current query as the sum total of its input (i.e., no chat history)?
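One possible middle ground, sketched below under the assumption that your Chrome build exposes the Early Preview session methods `clone()`, `prompt()`, and `destroy()` (names have shifted between releases, so check the current docs): keep one long-lived base session, but clone it per query. Per the Early Preview documentation, `clone()` resets the conversation context while keeping the session's initial configuration, so each query starts history-free without paying the full session-creation cost. The helper name `askFresh` and the mock session used below are my own illustration, not part of the API.

```javascript
// Sketch: stateless querying against a persistent base session.
// Assumes an Early Preview-style session object with clone()/prompt()/destroy().
async function askFresh(baseSession, query) {
  // clone() is documented as resetting conversation context but keeping
  // the session's initial prompts/parameters -- cheaper than a full create().
  const fresh = await baseSession.clone();
  try {
    return await fresh.prompt(query);
  } finally {
    fresh.destroy(); // release the clone's resources immediately
  }
}

// Hypothetical usage inside the extension (API name may differ per build):
//   const base = await ai.languageModel.create({ systemPrompt: "..." });
//   const answer = await askFresh(base, userQuery);
```

If cloning still proves too slow, the other lever is prompt design: a persistent session will keep accumulating context, so there is no documented "forget everything" call on the session itself short of clone/recreate.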