https://github.com/immersive-web/webxr-hand-input/blob/master/explainer.md
https://immersive-web.github.io/webxr-hand-input/
Yes
https://immersive-web.github.io/webxr-hand-input/ Exposes hands as an XRInputSource for use during a WebXR session. When hands are detected, the API can expose them as input sources of the "hand-tracking" type, which can also expose a fully-articulated hand model.
XR systems support tracking of users' hands to allow direct interaction with and manipulation of virtual objects in XR applications. This feature provides sites with an interface through WebXR to make use of these system hand trackers in a standardized way.
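As a rough sketch of how a site might consume this, the explainer's model is that a session requested with the "hand-tracking" feature exposes hands as input sources whose `hand` attribute is a map of joint spaces queryable per frame. The helper and names below (`collectJointPoses`, `refSpace`) are illustrative, not part of the proposal:

```javascript
// Sketch (assumed usage, not the definitive API surface): the session would be
// requested with the "hand-tracking" optional feature, e.g.
//   navigator.xr.requestSession("immersive-vr", { optionalFeatures: ["hand-tracking"] });
let refSpace; // assumed to come from session.requestReferenceSpace(...)

// Collect the position of every tracked joint of one hand for this frame.
// `hand` is the map-like XRHand (joint name -> XRJointSpace);
// `frame.getJointPose(...)` returns null while a joint is untracked.
function collectJointPoses(frame, hand, referenceSpace) {
  const poses = new Map();
  for (const [jointName, jointSpace] of hand.entries()) {
    const pose = frame.getJointPose(jointSpace, referenceSpace);
    if (pose) poses.set(jointName, pose.transform.position);
  }
  return poses;
}

// In the frame loop, hands show up as input sources with a non-null `hand`.
function onXRFrame(time, frame) {
  for (const inputSource of frame.session.inputSources) {
    if (inputSource.hand) {
      const joints = collectJointPoses(frame, inputSource.hand, refSpace);
      // ...drive a hand model or interaction logic from `joints`
    }
  }
  frame.session.requestAnimationFrame(onXRFrame);
}
```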
https://immersive-web.github.io/webxr-hand-input/
None
Pending
None
Gecko: No signal
WebKit: No signal
Web developers: No signals
No
None
No
https://www.chromestatus.com/feature/5719474055413760
This intent message was generated by Chrome Platform Status.
Sounds good to me! The plan is to implement it atop the equivalent OpenXR spec, which Mozilla was also consuming, so that would be helpful.