WebGPU for Language model


Nicolas C.

Dec 9, 2025, 9:32:22 AM
to Machine Learning for Kids
Hi,

Thank you so much for this amazing resource!

I encountered a crash when trying the Language Model activity:

Error message: in part 2, when trying to download the model architecture, I get the following errors, depending on the browser that I tried:

- Windows 10, Chrome v143.0.7499.40
"This model requires webgpu extension shade-f16, which is not enabled in this browser. You can try to launch Chrome Canary in the command line with flag --enable-dawn-features=allow_unsafe_apis which is not yet supported by this browser"

- Safari on iPadOS 18.6.2
"WebGPU is not enabled in your browser"

Is there a fix for me to enable this required feature, ideally for both Windows and macOS?

Thanks a lot.

Sincerely,

Nicolas

Dale Lane

Dec 9, 2025, 9:58:48 AM
to Machine Learning for Kids
Hiya

Thanks for the questions, and sorry that you've run into problems with this. I don't own a Windows computer, so testing on Windows is a bit of a blind spot for me - I do check docs to try to ensure that I write things in a way that should work cross-platform, but sometimes there are nuances that I miss by not being able to do more actual hands-on testing.

I need to start pulling together some FAQs for this, so I've used your questions to help me create this first draft.

Which of the "small" language models require WebGPU? 
All of them

What web browsers support WebGPU? 
A good list for this is maintained at https://caniuse.com/webgpu - identifying not just which browsers, but specifically which versions of which browsers. I've found it to be a generally reliable source of information for this sort of thing.

Which of the "small" language models require the shade-f16 WebGPU extension? 
These five models do require the shade-f16 WebGPU extension: SmolLM2-135M-Instruct-q0f16-MLC, TinyLlama-1.1B-Chat-v1.0-q4f16_1-MLC-1k, phi-1_5-q4f16_1-MLC, gemma-2-2b-it-q4f16_1-MLC-1k, RedPajama-INCITE-Chat-3B-v1-q4f16_1-MLC-1k
These three models do not require the shade-f16 WebGPU extension: Qwen2.5-0.5B-Instruct-q4f16_1-MLC, Llama-3.2-1B-Instruct-q4f16_1-MLC, stablelm-2-zephyr-1_6b-q4f16_1-MLC-1k

Can you add a few more "small" language models that don't require the shade-f16 WebGPU extension? 
That's on my todo list - I'll add some f32 variants that I can swap in for browsers that don't have f16 support. 

What web browsers support the shader-f16 WebGPU extension? 
A good list for this is maintained at https://caniuse.com/mdn-api_gpusupportedfeatures_feature_shader-f16 - identifying not just which browsers, but specifically which versions of which browsers. I've found it to be a generally reliable source of information for this sort of thing. Sometimes this will prompt you to start your browser with a particular flag to enable support that isn't turned on by default.

How can I get specific information about what my particular web browser on my particular computer supports? 
https://webgpureport.org is good for this
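If you'd rather check programmatically, here is a minimal sketch (not from the activity itself - just an illustration using the standard `navigator.gpu` API) that you can paste into your browser's developer console to see whether WebGPU and shader-f16 are available:

```javascript
// Feature-detect WebGPU and the shader-f16 extension.
// Uses only the standard WebGPU API: navigator.gpu, requestAdapter(),
// and GPUSupportedFeatures. Run this in the browser developer console.
async function checkWebGpuSupport() {
  const gpu = globalThis.navigator?.gpu;
  if (!gpu) {
    // Either the browser has no WebGPU support, or it is disabled.
    return "WebGPU is not available in this browser";
  }
  const adapter = await gpu.requestAdapter();
  if (!adapter) {
    // WebGPU exists, but no usable GPU adapter was returned.
    return "WebGPU is available, but no suitable GPU adapter was found";
  }
  return adapter.features.has("shader-f16")
    ? "WebGPU and shader-f16 are both supported"
    : "WebGPU is supported, but shader-f16 is not";
}

checkWebGpuSupport().then(console.log);
```

On a browser that passes the shader-f16 check, all of the models listed above should be able to run; otherwise, stick to the models that don't need the extension.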


Kind regards

D


Nicolas C.

Dec 9, 2025, 10:49:30 AM
to Machine Learning for Kids
Awesome - many thanks!

Sincerely,

Nicolas
