For the built-in Prompt API, there is a documented per-prompt input limit of about 1,024 tokens, and a session context window of up to ~4,096 tokens.
If I have a single text input that exceeds 1,024 tokens (but is still under 4,096 tokens) and I want to use it as the initial prompt of a session, for example:
const session = await LanguageModel.create({
  initialPrompts: [
    { role: 'system', content: '…' },
    { role: 'user', content: longText } // longText is the >1,024-token input
  ]
});
will the API allow me to pass that long text in one go (i.e., exceed the per-prompt limit) and still take advantage of the full 4,096-token session context window?
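In case the per-prompt cap does apply, my fallback would be to chunk the text myself and feed it across several prompts. Here is a rough sketch of what I have in mind (`chunkByTokenBudget` is my own hypothetical helper, and the ~4-characters-per-token ratio is only a heuristic, not anything the API guarantees):

```javascript
// Hypothetical fallback: split the input into pieces that each stay under
// an approximate token budget. The ~4 chars/token ratio is a heuristic
// assumption; an exact count would require whatever token-measuring
// facility the API itself exposes.
function chunkByTokenBudget(text, maxTokens, charsPerToken = 4) {
  const maxChars = maxTokens * charsPerToken;
  const chunks = [];
  for (let i = 0; i < text.length; i += maxChars) {
    chunks.push(text.slice(i, i + maxChars));
  }
  return chunks;
}

// In the browser I would then feed the chunks one at a time, e.g.:
//   for (const chunk of chunkByTokenBudget(veryLongText, 1000)) {
//     await session.prompt(chunk);
//   }
```

But this loses the convenience of a single `initialPrompts` entry, so I'd like to know whether chunking is actually necessary here.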