Contact emails
Spec
No spec available currently, but I am working with Vladimir Vukicevic from Mozilla to deliver compatible APIs in both browsers. Mozilla’s current plans are outlined in this blog post: http://blog.bitops.com/blog/2014/06/26/first-steps-for-vr-on-the-web/
Their current IDL is viewable here:
https://github.com/vvuk/gecko-dev/blob/oculus/dom/webidl/VRDevice.webidl
Mozilla’s VR discussion mailing list is here:
https://mail.mozilla.org/listinfo/web-vr-discuss
Summary
The WebVR API will provide input/output for Virtual Reality head-mounted displays (HMDs), such as the Oculus Rift, and potentially other six-degree-of-freedom devices.
Motivation
VR has become an area of intense research and developer interest recently, following the efforts of Oculus to bring cost-effective, performant hardware to developers and eventually consumers. This has prompted other large tech companies such as Sony and Samsung (rumored) to investigate the same space. Facebook had enough interest in the concept that they purchased Oculus for $2 billion.
Web developers are eager to experiment with this new medium, but since the web lacks native support, existing efforts have centered around plugins, local websocket services, or folded-up pieces of cardboard. There is also no mechanism for outputting rendered content directly to the HMD. All existing web content requires display mirroring, which is awkward and requires disruptive OS configuration. A web-native API has the opportunity to reduce latency over existing solutions and eliminate awkward setup.
Compatibility Risk
Since the technology behind this feature is fairly new, some standards and best practices are still being formulated. There's only one company (Oculus) with an open SDK at this time, so it's unclear whether the way they interact with the HMD hardware will become a de facto standard or whether it will shift as more companies begin competing in this space. Of particular note, handling input from the HMD seems as though it will be relatively stable across all hardware (polling for position, orientation, acceleration, and velocity), but handling of output may vary between manufacturers.
With that in mind, the intent is to produce an experimental API that remains behind a flag until industry trends have indicated a clear trajectory. In the meantime the API should remain flexible enough to let developers experiment and provide feedback about what is required for an effective VR experience, while also allowing the browser to react to changes in the VR landscape quickly and without concern for breaking existing content.
Ongoing technical constraints
None.
Will this feature be supported on all five Blink platforms (Windows, Mac, Linux, Chrome OS and Android)?
Will be supported on Windows, Mac, and Linux. It seems likely that Chrome OS will support the API with the same code path as Linux, but I haven’t yet verified.
OWP launch tracking bug?
https://code.google.com/p/chromium/issues/detail?id=389343
Link to entry on the feature dashboard
http://www.chromestatus.com/features/4532810371039232
Requesting approval to ship?
Ha ha ha! No. There's a long way to go before we can consider shipping anything.
A lot of the functionality required by VR is generally useful to the web platform even outside the strict context of VR. It is also particularly relevant to mobile. For example, on the input side, surfacing sensor data (accurate position, orientation, depth, etc.) both on the device you hold in your hands and the one you're using for display would be a great addition to the platform. On the display side, being able to drive multiple displays with possibly different characteristics seems quite useful too.
Apologies for resurrecting a six-month-old thread, but I'm now looking at adding WebVR to Chrome proper (behind a flag). As such, I wanted to solicit this list again for opinions: if WebVR is something you're violently opposed to, please speak now.
What's the binary size increase?
Hello.
On Thu Jan 15 2015 at 12:47:55 PM Elliott Sprehn <esp...@chromium.org> wrote:
> What's the binary size increase?
The patches I linked above make ChromeShell.apk 74kb larger.
On Thu Jan 15 2015 at 12:51:20 PM Nico Weber <tha...@chromium.org> wrote:
> Hello.
Hi! I presume this is "speaking now", so what concerns can I address? If you would prefer to speak off-list I'm okay with that.
Thanks for the response! The 74kb increase includes packaging in Cardboard.jar, which is the only requirement for Android support.

As I mentioned in my email yesterday, the goal right now is to add WebVR support on Android only. Oculus support is not being considered for merging at this time, so there are no desktop DLLs to consider. Looking forward, I don't feel it's practical to add Oculus support or other similar desktop hardware unless their SDKs install DLLs that can be dynamically linked by Chrome without requiring us to distribute them. (I do intend to continue maintaining Oculus support in my experimental branch in hopes that we can find a solution in this area.)

On the desktop, low market penetration of the required hardware is a very legitimate concern. It's actually more like 0.0001% of Chrome users may have Oculus hardware. On Android, though, every device (minus a few gyro-less freaks) can be a VR device with a bit of cardboard and a couple of lenses. Even without a harness, this API would be beneficial to developers who want to distribute photospheres, 360 video, or other applications in which your mobile device acts like a window into a scene.
I can't say I understand why, but I think for head-mounted gadgets polling sensors faster than the frame rate somehow makes sense.
vrSensor.getState() does perform a synchronous IPC at this point to ensure we get the absolute latest sensor values. This does introduce some overhead, but the overwhelming majority of latency actually comes from the rendering pipeline, which usually maintains a 2-3 frame buffer, adding some 50ms of (currently) inescapable latency. Optimizing that is no small task, but I believe it can and should be done in a way that benefits the whole browser, not just VR.
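As a minimal sketch of the polling pattern described above: query the sensor once per requestAnimationFrame tick, as late as possible before issuing draw calls. The interface shapes here (getState(), VRPositionState, obtaining the sensor via navigator.getVRDevices()) are assumptions drawn from the Mozilla IDL linked earlier and may not match exactly what ships behind the flag.

// TypeScript sketch; interface shapes are assumed from the VRDevice.webidl
// linked earlier in this thread, so treat them as illustrative, not final.
interface VRPositionState {
  timestamp: number;
  orientation: { x: number; y: number; z: number; w: number } | null;
  position: { x: number; y: number; z: number } | null;
}

interface PositionSensorVRDevice {
  // Synchronous call; returns the most recent sensor sample.
  getState(): VRPositionState;
}

// Assumed to have been obtained via navigator.getVRDevices() during setup.
declare const vrSensor: PositionSensorVRDevice;

function renderScene(state: VRPositionState) {
  // WebGL rendering for each eye would go here, driven by
  // state.orientation and state.position.
}

function onFrame() {
  // Poll right before rendering so the pose is as fresh as the pipeline allows.
  const state = vrSensor.getState();
  renderScene(state);
  requestAnimationFrame(onFrame);
}
requestAnimationFrame(onFrame);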
In terms of improving the predictability of device orientation events: I'm all for it, but improvements to that API don't negate the need for a VR API. Looking forward to eventual desktop support, it's tempting to take headset motion and pipe it into the device orientation events, but this doesn't handle cases like MacBooks that actually do have an internal accelerometer which the web already exposes. It also ignores the theoretical case of multiple headsets connected to a single device, and fails to address the need to expose information about the headset optics to ensure correct rendering.

On Fri Jan 16 2015 at 1:05:18 AM Philip Jägenstedt <phi...@opera.com> wrote:
> I can't say I understand why, but I think for head-mounted gadgets
> polling sensors faster than the frame rate somehow makes sense.
Polling at frequencies faster than the screen refresh allows for better head motion prediction, because when you begin rendering in VR you don't want to know where the user's head is now but where it will be in ~30ms (or however long it takes for your frame to reach the screen). 60Hz is too coarse for that type of prediction and would yield jittery results. By using a purpose-built VR library like Cardboard.jar on the backend, they can poll at whatever frequency they need to in their own thread in order to generate high-quality motion prediction.

--Brandon
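To make the motion-prediction point above concrete, here is an illustrative sketch (not from the thread) of extrapolating a head orientation ~30ms ahead by integrating an angular-velocity estimate. Field names and frame conventions are assumptions for illustration; the takeaway is that a constant-velocity extrapolation is only as good as the freshness and rate of the underlying samples.

// TypeScript sketch of simple orientation extrapolation; illustrative only.
type Quat = { x: number; y: number; z: number; w: number };
type Vec3 = { x: number; y: number; z: number };

// Hamilton product of two quaternions.
function quatMultiply(a: Quat, b: Quat): Quat {
  return {
    w: a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
    x: a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
    y: a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
    z: a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w,
  };
}

// Predict the orientation dtSeconds into the future by treating the
// angular velocity (radians/second) as constant over that interval.
function predictOrientation(q: Quat, angularVelocity: Vec3, dtSeconds: number): Quat {
  const { x, y, z } = angularVelocity;
  const speed = Math.sqrt(x * x + y * y + z * z);
  if (speed < 1e-9) return q;
  const angle = speed * dtSeconds;
  const s = Math.sin(angle / 2) / speed;
  const delta: Quat = { w: Math.cos(angle / 2), x: x * s, y: y * s, z: z * s };
  return quatMultiply(q, delta); // assumes body-frame angular velocity
}

// e.g. predict ~30ms ahead to cover the pipeline latency mentioned above:
// const predicted = predictOrientation(state.orientation, state.angularVelocity, 0.03);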
It's true that you can polyfill some of the WebVR functionality on top of device orientation events. The cardboard.jar can provide features like motion prediction, drift correction, and neck modeling that improve the quality of the signal, though. It also provides information about the optics of the harness being used so that applications can correctly render with the right IPD, field of view, and render target size.

There are also some historical issues that prevent device orientation events from being appropriate for realtime content like VR. Until very recently it only sampled at 20Hz, though we were able to bump that up to 60Hz. Even with that increase, however, the polling is not fast enough to do high-quality motion prediction. The fact that it's an event is problematic as well, since the events do not always arrive in sync with rAF events. This can cause stuttering as you either get two motion events for a single frame or no motion events at all. WebVR is explicitly a polling API, which avoids that problem completely.

There are additional benefits to using the Cardboard SDK that will come into play down the road, such as capturing magnet pull events,
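Tying back to the optics point above, here is a rough sketch of how a page might read the HMD's per-eye parameters before building its projection matrices. The method names (navigator.getVRDevices(), getRecommendedEyeFieldOfView(), getEyeTranslation()) are taken from the Mozilla IDL linked at the top of this thread and may well change, so treat this as illustrative only.

// TypeScript sketch of reading per-eye optics from an HMD device; the
// interface is assumed from the VRDevice.webidl linked at the top of the thread.
interface VRFieldOfView {
  upDegrees: number;
  rightDegrees: number;
  downDegrees: number;
  leftDegrees: number;
}

interface HMDVRDevice {
  getEyeTranslation(whichEye: "left" | "right"): { x: number; y: number; z: number };
  getRecommendedEyeFieldOfView(whichEye: "left" | "right"): VRFieldOfView;
}

async function setupHmd(): Promise<void> {
  // navigator.getVRDevices() is an assumed entry point from the early proposals.
  const devices: any[] = await (navigator as any).getVRDevices();
  const hmd: HMDVRDevice | undefined =
    devices.find(d => typeof d.getRecommendedEyeFieldOfView === "function");
  if (!hmd) return;

  for (const eye of ["left", "right"] as const) {
    const fov = hmd.getRecommendedEyeFieldOfView(eye); // asymmetric FOV in degrees
    const offset = hmd.getEyeTranslation(eye);         // roughly half the IPD
    // An application would build an off-axis projection from fov and shift the
    // view matrix by offset before rendering this eye's half of the canvas.
    console.log(eye, fov, offset);
  }
}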