prefix behavior has changed?

Die4Ever2005

Dec 22, 2025, 2:32:30 PM (3 days ago)
to Chrome Built-in AI Early Preview Program Discussions
let m = await LanguageModel.create();
await m.prompt([{role:'user', content: 'Hello'},{role:'assistant', content: 'Hello', prefix:true}]);

"Hi there! ๐Ÿ‘‹ ย Glad to hear from you. ย Is there anything specific you'd like to talk about or need help with? I'm happy to answer questions, generate ideas, write stories, or just chat. ย \n\nLet me know! ๐Ÿ˜Š\n\n\n\n"

If I prepend the prefix text myself, which is what I used to do, the combined result now looks like "HelloHi there!"
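
Here is a sketch of that old workflow, just to be concrete (the prompt call is the same as above; prefixText and fullMessage are my own illustrative names, not part of the API):

let session = await LanguageModel.create();
let prefixText = 'Hello';
// Send the conversation with an assistant prefix turn, then prepend the prefix
// text to the completion to get the full assistant message.
let completion = await session.prompt([
  {role: 'user', content: 'Hello'},
  {role: 'assistant', content: prefixText, prefix: true},
]);
// Old behavior: the completion continues the prefix (" there! ..."), so this gives "Hello there! ...".
// New behavior: the completion is a fresh greeting, so this gives "HelloHi there! ..." instead.
let fullMessage = prefixText + completion;
console.log(fullMessage);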

Die4Ever2005

Dec 22, 2025, 2:34:11 PM (3 days ago)
to Chrome Built-in AI Early Preview Program Discussions, Die4Ever2005
Version 144.0.7559.31 (Official Build) beta (64-bit)

Foundational model state: Ready
Model Name: v3Nano
Version: 2025.08.14.1358
Backend Type: CPU
File path: /home/ray/.config/google-chrome-beta/OptGuideOnDeviceModel/2025.8.21.1028
Folder size: 4,236.82 MiB

Die4Ever2005

Dec 22, 2025, 2:36:06 PM (3 days ago)
to Chrome Built-in AI Early Preview Program Discussions, Die4Ever2005

I tried it again and got a similar result.

let m=await LanguageModel.create(); await m.prompt([{role:'user', content: 'Hello'},{role:'assistant', content: 'Hello', prefix:true}]);

"Hi there! ๐Ÿ˜Š \n\nHow are you doing today? ย Is there anything I can help you with? ย \n\nI'm ready for anything you might ask or discuss! ย Whether you need information, creative writing, help brainstorming, or just a friendly chat, I'm here. ๐Ÿ˜Š\n\n\n\n"

chrome://on-device-internals/ says:

2:34:47 PM ai_language_model.cc(544) Starting on-device session for PromptApi
2:34:48 PM ai_language_model.cc(254) Executing model with input context of 14 tokens: <user>Hello<end><model>Hello
2:34:56 PM ai_language_model.cc(364) Model generates raw response with PromptApi: Hi there! 😊 How are you doing today? Is there anything I can help you with? I'm ready for anything you might ask or discuss! Whether you need information, creative writing, help brainstorming, or just a friendly chat, I'm here. 😊
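
A rough way to tell the two behaviors apart from the console (nothing official here, just string heuristics on the returned text):

let s = await LanguageModel.create();
let prefixText = 'Hello';
let out = await s.prompt([
  {role: 'user', content: 'Hello'},
  {role: 'assistant', content: prefixText, prefix: true},
]);
// If the prefix is honored, the completion should pick up after my text
// (e.g. " there! ..."), not restart with a fresh greeting ("Hi there! ...").
let continued = out.startsWith(' ') || /^[a-z]/.test(out);
console.log(continued ? 'continuation:' : 'fresh response:', prefixText + out);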

Die4Ever2005

Dec 22, 2025, 11:04:13 PM (2 days ago)
to Chrome Built-in AI Early Preview Program Discussions, Die4Ever2005
Interesting, it works fine on my desktop computer.

let m = await LanguageModel.create(); await m.prompt([{role:'user',content:'Hello'},{role:'assistant',content:'Hello',prefix:true}]);
23:02:28.582 " there! 👋\n\nHow can I help you today? I'm ready to answer your questions, chat, brainstorm ideas, or anything else I can do. 😊 \n\nJust let me know what's on your mind!"
23:02:41.360 let m = await LanguageModel.create(); await m.prompt([{role:'user',content:'Hello'},{role:'assistant',content:'Hello',prefix:true}]);
23:02:43.075 " there! 👋\n\nIt's nice to hear from you. How can I help you today?  Do you have any questions, need some information, or just want to chat? 😊 \n\nI'm ready for anything! Just let me know.\n\n\n\n"
23:02:49.281 let m = await LanguageModel.create(); await m.prompt([{role:'user',content:'Hello'},{role:'assistant',content:'Hello',prefix:true}]);
23:02:51.083 " there! 👋 \n\nIt's nice to hear from you. How can I help you today? 😊 \n\nI'm ready to answer questions, generate text, translate languages, write stories, or just chat. Just let me know what you have in mind! \n\n"
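
Streaming also makes the difference easy to spot, since the very first chunk shows whether the text picks up after the prefix. A sketch, assuming promptStreaming accepts the same message array as prompt (the chunk handling is only illustrative):

let s = await LanguageModel.create();
let stream = s.promptStreaming([
  {role: 'user', content: 'Hello'},
  {role: 'assistant', content: 'Hello', prefix: true},
]);
let first = true;
for await (const chunk of stream) {
  if (first) {
    // On the desktop the first chunk starts with " there!", i.e. it continues
    // the prefix instead of restarting with a new greeting.
    console.log('first chunk:', JSON.stringify(chunk));
    first = false;
  }
}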


Version 144.0.7559.20 (Official Build) beta (64-bit)

Foundational model state: Ready
Model Name: v3Nano
Version: 2025.06.30.1229
Backend Type: GPU (highest quality)
File path: /home/ray/.config/google-chrome-beta/OptGuideOnDeviceModel/2025.8.8.1141
Folder size: 4,088.14 MiB

Die4Ever2005

Dec 22, 2025, 11:05:28 PM (2 days ago)
to Chrome Built-in AI Early Preview Program Discussions, Die4Ever2005
My desktop is using an older model, OptGuideOnDeviceModel/2025.8.8.1141

Maybe the desktop failed to update the model because of the Linux issue with the size of /tmp; I haven't been starting Chrome with the custom TMPDIR environment variable.