I need to start pulling together some FAQs for this, so I've used your questions to create this first draft.
Which of the "small" language models require WebGPU?
All of them.
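Since every model needs WebGPU, a page can check for it up front. Here's a minimal sketch assuming the standard `navigator.gpu` entry point; the helper takes a navigator-like object as a parameter (my choice, so it can be exercised outside a browser):

```typescript
// Returns true if the given navigator-like object exposes WebGPU.
// Taking the object as a parameter keeps the check testable outside
// a browser environment.
export function hasWebGPU(nav: { gpu?: unknown } | undefined): boolean {
  return nav?.gpu !== undefined;
}

// Browser usage:
//   if (!hasWebGPU(navigator)) {
//     console.log("None of these models can run: WebGPU is unavailable");
//   }
```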
What web browsers support WebGPU?
A good list for this is maintained at https://caniuse.com/webgpu - it identifies not just which browsers, but which specific versions of each browser. I've found it to be a generally reliable source of information for this sort of thing.
Which of the "small" language models require the shader-f16 WebGPU extension?
These five models do require the shader-f16 WebGPU extension:
SmolLM2-135M-Instruct-q0f16-MLC
TinyLlama-1.1B-Chat-v1.0-q4f16_1-MLC-1k
phi-1_5-q4f16_1-MLC
gemma-2-2b-it-q4f16_1-MLC-1k
RedPajama-INCITE-Chat-3B-v1-q4f16_1-MLC-1k
These three models do not require the shader-f16 WebGPU extension:
Qwen2.5-0.5B-Instruct-q4f16_1-MLC
Llama-3.2-1B-Instruct-q4f16_1-MLC
stablelm-2-zephyr-1_6b-q4f16_1-MLC-1k
Can you add a few more "small" language models that don't require the shader-f16 WebGPU extension?
That's on my todo list - I'll add some f32 variants that I can swap in for browsers that don't have f16 support.
What web browsers support the shader-f16 WebGPU extension?
A good list for this is maintained at https://caniuse.com/mdn-api_gpusupportedfeatures_feature_shader-f16 - again identifying not just which browsers, but which specific versions of each. Sometimes this page will prompt you to start your browser with a particular flag to enable support that isn't turned on by default.
How can I get specific information about what my particular web browser on my particular computer supports?
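One way is to ask the browser directly via the standard WebGPU JavaScript API: request an adapter and inspect its feature set. A sketch (the `supportsF16Models` helper is my own name, kept pure so it can be tested outside a browser):

```typescript
// Given an adapter's feature set, report whether the f16-dependent
// models listed above could run on it.
export function supportsF16Models(features: ReadonlySet<string>): boolean {
  return features.has("shader-f16");
}

// Browser usage (run in a page or the devtools console):
//
//   const adapter = await navigator.gpu?.requestAdapter();
//   if (!adapter) {
//     console.log("WebGPU is not supported in this browser");
//   } else {
//     console.log("Adapter features:", [...adapter.features]);
//     console.log("shader-f16 models OK:", supportsF16Models(adapter.features));
//   }
```

Printing `[...adapter.features]` lists everything your particular browser and GPU combination supports, not just shader-f16.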
Kind regards
D