Intent to Implement: WebVR


Brandon Jones

Jun 26, 2014, 6:44:41 PM
to blink-dev

Contact emails

baj...@chromium.org


Spec

No spec available currently, but I am working with Vladimir Vukicevic from Mozilla to deliver compatible APIs in both browsers. Mozilla’s current plans are outlined in this blog post: http://blog.bitops.com/blog/2014/06/26/first-steps-for-vr-on-the-web/


Their current IDL is viewable here:

https://github.com/vvuk/gecko-dev/blob/oculus/dom/webidl/VRDevice.webidl


Mozilla’s VR discussion mailing list is here:

https://mail.mozilla.org/listinfo/web-vr-discuss


Summary

The WebVR API will provide input/output for Virtual Reality head mounted displays (HMDs), such as the Oculus Rift, and potentially other six-degree-of-freedom devices.
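To make the shape of the proposal concrete, here is a hedged sketch of the enumeration flow both vendors have discussed. Function and field names loosely follow Mozilla's draft IDL and are purely illustrative (the draft distinguishes device types by interface; an `isHmd` flag is used here only so the sketch stays self-contained). The interesting detail is that an HMD's display output and its position sensor arrive as separate devices sharing a `hardwareUnitId`:

```javascript
// Illustrative sketch only: the real draft API exposes separate
// HMDVRDevice/PositionSensorVRDevice interfaces; isHmd is a stand-in flag.
// An HMD and its motion sensor are paired via a shared hardwareUnitId.
function pairHmdWithSensor(devices) {
  const hmd = devices.find(d => d.isHmd);
  if (!hmd) return null; // no headset connected
  const sensor = devices.find(
    d => !d.isHmd && d.hardwareUnitId === hmd.hardwareUnitId);
  return { hmd, sensor };
}
```

In the draft, a page would obtain the device list asynchronously (e.g. via a `getVRDevices()`-style call) and then pair devices as above before rendering.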


Motivation

VR has become an area of intense research and developer interest recently, following the efforts of Oculus to bring cost-effective, performant hardware to developers and eventually consumers. This has prompted other large tech companies such as Sony and Samsung (rumored) to investigate the same space. Facebook had enough interest in the concept that they purchased Oculus for $2 billion.


Web developers are eager to experiment with this new medium, but since the web lacks native support, existing efforts have centered around plugins, local websocket services, or folded-up pieces of cardboard. There is also no mechanism for outputting rendered content directly to the HMD. All existing web content requires display mirroring, which is awkward and requires disruptive OS configuration. A web-native API has the opportunity to reduce latency over existing solutions and eliminate awkward setup.


Compatibility Risk

Since the technology behind this feature is fairly new, some standards and best practices are still being formulated. There's only one company (Oculus) with an open SDK at this time, so it's unclear whether the way they interact with the HMD hardware will become a de facto standard or will shift as more companies begin competing in this space. Of particular note, handling input from the HMD seems as though it will be relatively stable across all hardware (polling for position, orientation, acceleration, and velocity), but handling of output may vary between manufacturers.


With that in mind, the intent is to produce an experimental API that remains behind a flag until industry trends have indicated a clear trajectory. In the meantime the API should remain flexible enough to let developers experiment and provide feedback about what is required for an effective VR experience, while also allowing the browser to react to changes in the VR landscape quickly and without concern for breaking existing content.


Ongoing technical constraints

None.


Will this feature be supported on all five Blink platforms (Windows, Mac, Linux, Chrome OS and Android)?

Will be supported on Windows, Mac, and Linux. It seems likely that Chrome OS will support the API with the same code path as Linux, but I haven’t yet verified.


OWP launch tracking bug?

https://code.google.com/p/chromium/issues/detail?id=389343


Link to entry on the feature dashboard

http://www.chromestatus.com/features/4532810371039232


Requesting approval to ship?

Ha ha ha! No. There's a long way to go before we can consider shipping anything.

Adam Barth

Jun 26, 2014, 9:03:59 PM
to Brandon Jones, blink-dev
Two questions:

1) Do you plan to implement this feature as a module or will it require code in core?
2) How does this feature align with our focus on mobile?

Adam

Brandon Jones

Jun 26, 2014, 10:07:06 PM
to Adam Barth, blink-dev
I already have most of the initial feature implemented (I wanted to ensure it was practical before proposing it to a wider audience). The majority of the code is in a module. There is currently one small interaction with Element: a new optional parameter to requestFullscreen. It may be possible to refactor that code to live in a module, however, or to alter the proposed API to remove the interaction with core. I'll post that code soon to facilitate this conversation.

As for aligning with our mobile focus, I'll point out that our Cardboard experiment was one of the more talked-about aspects of Google I/O. The team is currently relying on existing orientation events to achieve their VR effects, but there are some known issues, such as sensor throttling, that make them sub-optimal for VR experiences. Additionally, the work done to do in-browser image distortion, as well as future work to reduce sensor-to-photon latency, should be beneficial to mobile VR. Beyond that I'd like to work with the Cardboard team to determine what could improve their web experience and work that into the API as well.

I realize this API is bound to invite a lot of questions due to its early state and technical requirements. I'm very happy to discuss (and change) any aspect of it to make sure it works well with the web platform as a whole.

--Brandon

Andrew Scherkus

Jun 26, 2014, 11:18:38 PM
to Brandon Jones, Adam Barth, blink-dev
Personally I'm interested in the impact on Chromium as a whole.

For example, it appears the Firefox demo binaries link in the Oculus SDK -- does implementing this API imply Chromium would do the same?

Andrew

Brandon Jones

Jun 27, 2014, 12:17:05 AM
to sche...@chromium.org, aba...@google.com, blin...@chromium.org
That's a great question, and one that I hoped to get some advice on, so I'm glad you brought it up.

My current code has the Oculus SDK sitting in third_party and statically links it. This is working pretty well, but the big question is one of license compatibility. I've attached the SDK license for reference; it's a custom license. It appears to be pretty permissive, but it's not for me to decide whether it's compatible with Chrome's third-party lib policy.

Mozilla ran into the same question (https://groups.google.com/forum/#!topic/mozilla.dev.platform/VzsJjYpFRJs), and their solution was to build the library as a DLL and link it dynamically. Chrome could do that as well (the API is very straightforward C), though there's still a question of how it would be distributed. The SDK isn't configured as a dynamic lib as-is, so we would either need to build and distribute the library ourselves or point people at a resource for getting a pre-built binary lib (though of course that would severely hurt adoption).

I'm honestly not very familiar with our policies regarding third_party, so I'd appreciate some pointers in the right direction. Regardless, we have options.

--Brandon
LICENSE.txt

Adam Barth

Jun 27, 2014, 12:30:54 AM
to Brandon Jones, sche...@chromium.org, blin...@chromium.org
I don't think we should implement this feature in trunk. As far as I can tell, WebVR is unrelated to our goals for 2014 [1]. You are, of course, welcome to implement the feature in a branch. Hopefully the module system makes that easier than it would be otherwise.

Brandon Jones

Jun 27, 2014, 1:02:54 AM
to Adam Barth, sche...@chromium.org, blin...@chromium.org
That sounds perfectly reasonable to me.

Nico Weber

Jun 27, 2014, 12:39:45 PM
to Brandon Jones, Adam Barth, Andrew Scherkus, blink-dev
The third party policies are documented at http://www.chromium.org/developers/adding-3rd-party-libraries

How large is the oculus dll? Is that really something we'd have to ship, instead of relying on users to install it?

Brandon Jones

Jun 27, 2014, 12:59:03 PM
to tha...@chromium.org, aba...@google.com, sche...@chromium.org, blin...@chromium.org
Thanks for the third_party link!

I haven't built the code as a dynamic lib myself yet, but the ones that Mozilla has clock in at 661.504 KB for libovr64.dll and 1152.044 KB for libovr.dylib. There's a decent chunk of the code that Chrome doesn't use, though, so it may be that a statically linked version adds less bulk. I'll have to test it out and see how much it grows the binary size.

I'd personally prefer that users link against a library that was installed on their OS by an Oculus installer or another app, but that doesn't appear to be their model at this time. The SDK is currently geared towards an assumption that apps will statically link it.

Brandon Jones

Jun 27, 2014, 1:06:24 PM
to tha...@chromium.org, aba...@google.com, sche...@chromium.org, blin...@chromium.org
Just tried it. On my OS X release build, statically linking libovr added 330.52 KB to the size of Chromium.app.

Nico Weber

Jun 27, 2014, 1:10:11 PM
to Brandon Jones, Adam Barth, Andrew Scherkus, blink-dev
That seems pretty large for something most users probably won't use. It'd probably have to be built as a DLL and downloaded through the component updater.

Vangelis Kokkevis

Jun 27, 2014, 3:48:52 PM
to Brandon Jones, Adam Barth, sche...@chromium.org, blink-dev
A lot of the functionality required by VR is generally useful to the web platform even outside the strict context of VR. It is also particularly relevant to mobile. For example, on the input side surfacing sensor data (accurate position, orientation, depth, etc) both on the device you hold in your hands and the one you're using for display would be a great addition to the platform. On the display side, being able to drive multiple displays with possibly different characteristics seems quite useful too.

Vangelis




To unsubscribe from this group and stop receiving emails from it, send an email to blink-dev+...@chromium.org.

Andrew Scherkus

Jun 27, 2014, 3:57:40 PM
to Vangelis Kokkevis, Brandon Jones, Adam Barth, blink-dev
On Fri, Jun 27, 2014 at 12:48 PM, Vangelis Kokkevis <vang...@google.com> wrote:
A lot of the functionality required by VR is generally useful to the web platform even outside the strict context of VR. It is also particularly relevant to mobile. For example, on the input side surfacing sensor data (accurate position, orientation, depth, etc) both on the device you hold in your hands and the one you're using for display would be a great addition to the platform. On the display side, being able to drive multiple displays with possibly different characteristics seems quite useful too.

Agreed - but would that mean we'd be better off providing more primitive APIs for device/display enumeration on top of which something like WebVR could sit?

I feel the same way with respect to lowering input-to-display latency: we should be doing that anyway for the platform as a whole. I don't think introducing a new API would necessarily help us here.

Adam Barth

Jun 27, 2014, 4:00:45 PM
to Vangelis Kokkevis, Brandon Jones, Andrew Scherkus, blink-dev
IMHO, we have more basic problems for mobile that we need to address first.  For example, we perform 14 hit tests on every gesture [1].  These are real problems faced by everyone who is trying to develop low latency content for the mobile web.

Adam




crh...@gmail.com

Oct 31, 2014, 4:16:01 PM
to blin...@chromium.org, vang...@google.com, baj...@google.com, sche...@chromium.org, aba...@google.com
Hi there!

I'm developing a browser-based project that requires WebVR. While there are a number of issues with Chrome that do need to be addressed, using that as a reason to delay or deny the implementation of WebVR is bad for the VR movement and bad for Chromium.

Underestimating the impact of coming VR devices would be a massive oversight. By delaying implementation, you are pushing the developers of an industry that is soon to be booming basically all to Mozilla. Since Chrome is by far the most used browser, this damages the VR movement because we have to either distribute special builds or tell people to switch. Also, in our tests, Chrome has been the most capable of handling our application.

At the very least, I would hope that this feature would cross the desk of someone at Google who has a deep understanding of the future of VR and the potential revenue opportunities, and could then determine how to move forward.

By not implementing browser-based access to VR sensors/devices, the development ecosystem will heavily favor vendor/implementation lock-in, which will hurt the speed of growth and will also ensure that a corporation like Facebook, which purchased Oculus, will have the ability to launch and maintain a closed or highly controlled content distribution platform with minimal competition. IMO, that might qualify as 'being evil,' granted in the negative sense of responsibility.

Thank you for taking the time to read this and for any further consideration that occurs.

Thank you,
Casey Hancock
VCEMO CTO

Brandon Jones

Jan 15, 2015, 1:34:05 PM
to crh...@gmail.com, blin...@chromium.org, vang...@google.com, sche...@chromium.org, aba...@google.com
Apologies for resurrecting a six-month-old thread, but I'm now looking at adding WebVR to Chrome proper (behind a flag). As such, I wanted to solicit this list again for opinions: if WebVR is something you're violently opposed to, please speak now or forever hold your peace. :)

To provide an update on what's happened since the original Intent to Implement:

The WebVR feature has been in development in an experimental branch, as suggested by Adam Barth. Currently there are implementations for the Oculus Rift on Windows, Mac, and Linux, and for Google Cardboard on Android. You can find those branches here:

As I've been developing the feature I've made periodic binary builds available to developers, and Mozilla has been doing the same. These builds have proven popular with developers, and there's already a decent amount of content being built against them. Additionally, Mozilla has continued to experiment with the medium as a means of promoting it, and has created a very compelling experience at MozVR.com. There have also been community-driven Meetups started around the technology, including one tomorrow at Google SF.

The code I'm considering landing at this point is a subset of the experimental branches. You can see the proposed CLs here:

The plan at this point is to land only Cardboard support at first, since there are still some hurdles to clear in terms of integrating the Oculus Rift SDK into Chrome with minimal impact on users that lack the hardware. Additionally, I'm going to initially avoid landing the modifications to the fullscreen API that are introduced in the experimental branch, as they represent more invasive changes to the rendering pipeline and warrant further discussion before landing. The orientation and headset optics information made available in the initial patches are compelling features in their own right, though. I do anticipate that the feature will remain behind a flag for a while to come while the VR hardware landscape settles, but having the feature accessible from about:flags will make it much easier for interested developers to experiment.

As always, I'm happy to answer any questions anyone may have or clarify any points of confusion.

--Brandon Jones

Elliott Sprehn

Jan 15, 2015, 3:47:59 PM
to Brandon Jones, crh...@gmail.com, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
What's the binary size increase?

Nico Weber

Jan 15, 2015, 3:51:24 PM
to Brandon Jones, crh...@gmail.com, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
On Thu, Jan 15, 2015 at 10:34 AM, 'Brandon Jones' via blink-dev <blin...@chromium.org> wrote:
Apologies for resurrecting a six month old thread, but I'm now looking at adding WebVR to Chrome proper (behind a flag). As such, I wanted to solicit this list again for opinions: If WebVR is something you're violently opposed to please speak now

Hello.

Brandon Jones

Jan 15, 2015, 4:52:45 PM
to Elliott Sprehn, crh...@gmail.com, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
On Thu Jan 15 2015 at 12:47:55 PM Elliott Sprehn <esp...@chromium.org> wrote:
What's the binary size increase?

The patches I linked above make ChromeShell.apk 74kb larger.

On Thu Jan 15 2015 at 12:51:20 PM Nico Weber <tha...@chromium.org> wrote:
Hello.

Hi! I presume this is "speaking now", so what concerns can I address? If you would prefer to speak off-list I'm okay with that.

--Brandon

Nico Weber

Jan 15, 2015, 6:53:15 PM
to Brandon Jones, Elliott Sprehn, crh...@gmail.com, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
On Thu, Jan 15, 2015 at 1:52 PM, 'Brandon Jones' via blink-dev <blin...@chromium.org> wrote:
On Thu Jan 15 2015 at 12:47:55 PM Elliott Sprehn <esp...@chromium.org> wrote:
What's the binary size increase?

The patches I linked above make ChromeShell.apk 74kb larger.

Is this all? Or are there any dlls we'd have to ship?
 

On Thu Jan 15 2015 at 12:51:20 PM Nico Weber <tha...@chromium.org> wrote:
Hello.

Hi! I presume this is "speaking now", so what concerns can I address? If you would prefer to speak off-list I'm okay with that.

The usual concerns of "it adds binary size, it needs security fuzzing, it makes things harder to maintain, less than 0.1% of chrome users have VR devices at this point".

How large is the diff from your branch to mainline? How much of it runs in the sandbox?

Brandon Jones

Jan 15, 2015, 7:31:49 PM
to Nico Weber, Elliott Sprehn, crh...@gmail.com, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
Thanks for the response!

The 74kb increase includes packaging in Cardboard.jar, which is the only requirement for Android support.

As I mentioned in my email yesterday, the goal right now is to add WebVR support on Android only. Oculus support is not being considered for merging at this time, so there are no desktop DLLs to consider. Looking forward, I don't feel it's practical to add Oculus support or other similar desktop hardware unless their SDKs install DLLs that can be dynamically linked by Chrome without requiring us to distribute them. (I do intend to continue maintaining Oculus support in my experimental branch in hopes that we can find a solution in this area.)

On the desktop, low market penetration of the required hardware is a very legitimate concern. It's actually more like 0.0001% of Chrome users who may have Oculus hardware. On Android, though, every device (minus a few gyro-less freaks) can be a VR device with a bit of cardboard and a couple of lenses. Even without a harness, this API would be beneficial to developers who want to distribute photospheres, 360 video, or other applications in which your mobile device acts like a window into a scene.

The difference between the proposed patches and the experimental branches is a little difficult to gauge, since the branches contain things like the full Oculus SDK, which isn't going to be considered for merging into Chrome any time soon, as well as some experimental code paths which will probably not prove practical. The Blink branch is a little easier to compare, though: the CL contains 27 files, 21 of which are new (all but one of the new files are in modules/vr). The experimental branch contains 52 files, only one of which is new beyond what's in the CL. Most of those files are minor changes to plumb through a new optional argument to requestFullscreen.

Unfortunately I'm not sure I know the right way to answer your question about the sandbox. Interaction with the Cardboard API (and the Oculus API, though that's unimportant at this point) all happens in content/browser/vr; everything else is basically message plumbing. The code is generally modeled after the existing gamepad support. I believe this puts the hardware interaction outside of the sandbox.

Nico Weber

Jan 15, 2015, 7:34:07 PM
to Brandon Jones, Elliott Sprehn, Casey Hancock, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
On Thu, Jan 15, 2015 at 4:31 PM, Brandon Jones <baj...@google.com> wrote:
Thanks for the response!

The 74kb increase includes packaging in Cardboard.jar, which is the only requirement for Android support.

As I mentioned in my email yesterday the goal right now is to add WebVR support on Android only. Oculus support is not being considered for merging at this time, and so there's no desktop DLLs to consider. Looking forward I don't feel like it's practical to add Oculus support or other similar desktop hardware unless their SDKs install DLLs that can be dyncamically linked by Chrome without requiring us to distribute them. (I do intend to continue maintaining Oculus support in my experimental branch in hopes that we can find a solution in this area.)

On the desktop low market penetration of the required hardware is a very legitimate concern. It's actually more like 0.0001% of Chrome users may have Oculus hardware. On Android, though, every device (minus a few gyro-less freaks) can be a VR device with a bit of cardboard and a couple lenses. Even without a harness, though, this API would be beneficial to developers that want to distribute photospheres, 360 video, or other applications in which your mobile device acts like a window into a scene.

What does Cardboard.jar do that can't be done in a js framework, using WebGL and the device orientation api?

Brandon Jones

Jan 15, 2015, 7:53:16 PM
to Nico Weber, Elliott Sprehn, Casey Hancock, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
It's true that you can polyfill some of the WebVR functionality on top of device orientation events. Cardboard.jar can provide features like motion prediction, drift correction, and neck modeling that improve the quality of the signal, though. It also provides information about the optics of the harness being used, so that applications can render correctly with the right IPD, field of view, and render target size.

There are also some historical issues that prevent device orientation events from being appropriate for realtime content like VR. Until very recently they only sampled at 20 Hz, though we were able to bump that up to 60 Hz. Even with that increase, however, the polling is not fast enough to do high-quality motion prediction. The fact that it's an event is problematic as well, since the events do not always arrive in sync with rAF events. This can cause stuttering as you either get two motion events for a single frame or no motion events at all. WebVR is explicitly a polling API, which avoids that problem completely.
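To make the polling model concrete, here is a minimal sketch of a frame loop that samples exactly once per animation frame, so rendering can never observe zero or two updates in a frame. `vrSensor` and `raf` are illustrative stand-ins for a WebVR sensor object and requestAnimationFrame, not the real API surface:

```javascript
// Sketch: one poll per frame. Because the sample is pulled inside the
// rAF callback, motion data and rendering stay in lockstep by construction.
function startFrameLoop(vrSensor, render, raf) {
  function frame() {
    const state = vrSensor.getState(); // freshest sample, exactly once per frame
    render(state);
    raf(frame); // schedule the next frame
  }
  raf(frame);
}
```

Contrast this with the event model, where a handler may fire zero, one, or two times between consecutive rAF callbacks.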

There are additional benefits to using the Cardboard SDK that will come into play down the road, such as capturing magnet pull events, monitoring NFC tags, and providing correct image distortion, but those are not part of the initial CLs.

It should also be noted that WebVR relies on WebGL to render content, so we're not re-inventing any wheels there.

Philip Jägenstedt

Jan 16, 2015, 3:31:05 AM
to Brandon Jones, Nico Weber, Elliott Sprehn, Casey Hancock, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
On Fri, Jan 16, 2015 at 1:53 AM, 'Brandon Jones' via blink-dev
<blin...@chromium.org> wrote:
> There's also some historical issues that prevent devices orientation events
> from being appropriate for realtime content like VR. Until very recently it
> only sampled at 20Hz, though we were able to bump that up to 60Hz. Even with
> that increase, however, the polling is not fast enough to do high quality
> motion prediction. The fact that it's an event is problematic as well, since
> the events do not always arrive in sync with rAF events. This can cause
> stuttering as you either get two motion events for a single frame or no
> motion events at all. WebVR is explicitly a polling API, which avoids that
> problem completely.

Yeah, this is a problem with the deviceorientation and devicemotion
events. Assuming that the underlying hardware is the same, does the
WebVR solution entail synchronous IPC to get the very latest data, or
is there still some level of delay left? If you have solved the
problem, I think we should really turn vrSensor.getState() into a
general polling API with the same data representation as the
deviceorientation and devicemotion events so that people using the
events can easily migrate and get the same benefits.

Philip

Anne van Kesteren

Jan 16, 2015, 3:48:55 AM
to Brandon Jones, Nico Weber, Elliott Sprehn, Casey Hancock, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
On Fri, Jan 16, 2015 at 1:53 AM, 'Brandon Jones' via blink-dev
<blin...@chromium.org> wrote:
> The fact that it's an event is problematic as well, since
> the events do not always arrive in sync with rAF events.

This we can fix, no? HTML now defines the timing for
requestAnimationFrame in a lot more detail and allows for a bunch of
events that will influence layout to be synchronized with it.


--
https://annevankesteren.nl/

Philip Jägenstedt

Jan 16, 2015, 4:05:22 AM
to Anne van Kesteren, Brandon Jones, Nico Weber, Elliott Sprehn, Casey Hancock, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
I can't say I understand why, but I think for head-mounted gadgets
polling sensors faster than the frame rate somehow makes sense. "The
new Oculus VR™ sensor supports a refresh rate of up to 1000hz," says
https://www.oculus.com/blog/update-on-developer-kit-technology-shipping-details/

Philip

Brandon Jones

Jan 16, 2015, 12:31:47 PM
to Philip Jägenstedt, Anne van Kesteren, Nico Weber, Elliott Sprehn, Casey Hancock, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
vrSensor.getState() does perform a synchronous IPC at this point to ensure we get the absolute latest sensor values. This does introduce some overhead, but the overwhelming majority of latency actually comes from the rendering pipeline, which usually maintains a 2-3 frame buffer, adding some 50ms of (currently) inescapable latency. Optimizing that is no small task, but I believe it can and should be done in a way that benefits the whole browser, not just VR.

In terms of improving the predictability of device orientation events: I'm all for it, but improvements to that API don't negate the need for a VR API. Looking forward to eventual desktop support, it's tempting to take headset motion and pipe it into the device orientation events, but this doesn't handle cases like MacBooks that actually do have an internal accelerometer which the web already exposes. It also ignores the theoretical case of multiple headsets connected to a single device, and fails to address the need to expose information about the headset optics to ensure correct rendering.
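On the optics point: the draft IDL describes a per-eye field of view as four angles (up/down/left/right, in degrees). As a hedged illustration of why an application needs that data, here is a sketch of turning those angles into off-axis near-plane frustum extents for the projection. The field names follow the draft's VRFieldOfView shape but should be treated as provisional:

```javascript
// Sketch: convert per-eye FOV angles (degrees) into near-plane frustum
// extents. Because left/right need not be equal, the resulting projection
// is off-axis, which is what HMD optics generally require.
function frustumFromFov(fov, near) {
  const rad = Math.PI / 180; // degrees to radians
  return {
    left:   -near * Math.tan(fov.leftDegrees * rad),
    right:   near * Math.tan(fov.rightDegrees * rad),
    top:     near * Math.tan(fov.upDegrees * rad),
    bottom: -near * Math.tan(fov.downDegrees * rad),
  };
}
```

These extents would then feed a standard WebGL perspective-frustum projection matrix, one per eye.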

On Fri Jan 16 2015 at 1:05:18 AM Philip Jägenstedt <phi...@opera.com> wrote:
I can't say I understand why, but I think for head-mounted gadgets
polling sensors faster than the frame rate somehow makes sense. 

Polling at frequencies faster than the screen refresh allows for better head motion prediction, because when you begin rendering in VR you don't want to know where the user's head is now but instead where it will be in ~30ms (or however long it takes for your frame to reach the screen). 60 Hz is too coarse for that type of prediction and would yield jittery results. By using a purpose-built VR library like Cardboard.jar on the backend, they can poll at whatever frequency they need in their own thread in order to generate high-quality motion prediction.
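The idea can be sketched with a deliberately simplified one-axis linear predictor. Real implementations work on quaternions and filter the velocity estimate; everything below is illustrative, not what Cardboard.jar actually does:

```javascript
// Sketch: extrapolate head yaw forward by the expected render latency
// using the angular velocity between the two most recent samples.
// Sampling at e.g. 1000 Hz makes the velocity estimate far less noisy
// than one derived from 60 Hz samples, which is the point made above.
function predictYaw(prev, curr, latencyMs) {
  // prev/curr: { t: timestamp in ms, yaw: radians }
  const velocity = (curr.yaw - prev.yaw) / (curr.t - prev.t); // rad/ms
  return curr.yaw + velocity * latencyMs;
}
```

With 60 Hz samples the two points are ~16 ms apart, so any sensor noise is amplified when extrapolated 30 ms ahead; millisecond-spaced samples keep the velocity estimate, and therefore the prediction, much steadier.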

--Brandon

Brandon Jones

Jan 17, 2015, 6:46:30 PM
to Philip Jägenstedt, Anne van Kesteren, Nico Weber, Elliott Sprehn, Casey Hancock, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
Relevant to this conversation: Mozilla announced at a WebVR-centric SFHTML5 event last night that they are landing WebVR support (behind a flag) in their nightly Firefox builds in the next couple of days. Sounds like they'll be talking more about it on a blog post at MozVR.com beginning of next week.

For what it's worth, the event, hosted at Google SF, was really well attended! I think I heard there were some 400 people there. We handed out a bunch of Cardboard units. :) Lots of interest in the community around VR technology!

--Brandon

Sami Kyostila

Jan 19, 2015, 9:58:54 AM
to Brandon Jones, Philip Jägenstedt, Anne van Kesteren, Nico Weber, Elliott Sprehn, Casey Hancock, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
2015-01-16 17:31 GMT+00:00 'Brandon Jones' via blink-dev <blin...@chromium.org>:
vrSensor.getState() does perform a synchronous IPC at this point to ensure we get the absolute latest sensor values. This does introduce some overhead, but the overwhelming majority of latency actually comes from the rendering pipeline, which usually maintains a 2-3 frame buffer adding some 50ms of (currently) inescapable latency. Optimizing that is no small task, but I believe it can and should be done in a way that benefits the whole browser, not just VR.

Out of curiosity, does content using the WebVR prototype get comparable latency to a native Android application using the Cardboard SDK or does Chrome add some measurable overhead?
 
In terms of improving the predictability of device orientation events: I'm all for it, but improvements to that API don't negate the need for a VR API. Looking forward to eventual desktop support, it's tempting to take headset motion and pipe it into the device orientation but this doesn't handle cases like Macbooks that actually do have an internal accelerometer which the web already exposes. It also ignores a theoretical case of multiple headsets connected to a single device, and fails to address the need to expose information about the headset optics to ensure correct rendering.

On Fri Jan 16 2015 at 1:05:18 AM Philip Jägenstedt <phi...@opera.com> wrote:
I can't say I understand why, but I think for head-mounted gadgets
polling sensors faster than the frame rate somehow makes sense. 

Polling at frequencies faster than the screen refresh allows for better head motion prediction, because when you begin rendering in VR you don't want to know where the user's head is now but instead where it will be in ~30ms (or however long it takes for your frame to reach the screen). 60hz is too coarse for that type of prediction, and would yield jittery results. By using purpose built VR like Cardboard.jar on the backend they can poll at whatever frequency they need to in their own thread in order to generate high-quality motion prediction.

--Brandon


- Sami

Brandon Jones

Jan 19, 2015, 12:28:04 PM
to Sami Kyostila, Philip Jägenstedt, Anne van Kesteren, Nico Weber, Elliott Sprehn, Casey Hancock, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
Chrome definitely adds latency, primarily through its multi-frame rendering pipeline. I haven't been able to measure the actual motion-to-photon latency on Android yet, but on desktop, where I can use some of the Oculus latency measurement tools, it's about 64ms. On desktop, though, I am also using timewarp to compensate for some of the delay, and it makes a massive difference in the apparent latency. Of course, less actual latency would always be better.

Chrome's poor latency isn't something that only affects VR efforts, and we've had persistent complaints from developers trying to make realtime WebGL games. It's my hope that we can work to improve latency for all of these situations over the next year, as it would make a lot of developers very happy!


Dominic Cooney

Jan 19, 2015, 8:02:49 PM
to Brandon Jones, Sami Kyostila, Philip Jägenstedt, Anne van Kesteren, Nico Weber, Elliott Sprehn, Casey Hancock, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
I commented privately to some Blink contributors, but let me repeat it in this forum:

WebVR puts pressure on low-latency input and rendering which is something we want to be able to do anyway for other use cases. So I wholeheartedly support implementing WebVR in trunk.

sid...@gmail.com

unread,
Jan 20, 2015, 4:58:23 AM1/20/15
to blin...@chromium.org, baj...@google.com
Hi, I would like to contribute to the project. Do let me know the areas where you need specific contributions.
Thanks,
SIddharth.

Philip Jägenstedt

unread,
Jan 20, 2015, 7:00:09 AM1/20/15
to Brandon Jones, Anne van Kesteren, Nico Weber, Elliott Sprehn, Casey Hancock, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
On Fri, Jan 16, 2015 at 6:31 PM, Brandon Jones <baj...@google.com> wrote:
> vrSensor.getState() does perform a synchronous IPC at this point to ensure
> we get the absolute latest sensor values. This does introduce some overhead,
> but the overwhelming majority of latency actually comes from the rendering
> pipeline, which usually maintains a 2-3 frame buffer adding some 50ms of
> (currently) inescapable latency. Optimizing that is no small task, but I
> believe it can and should be done in a way that benefits the whole browser,
> not just VR.
>
> In terms of improving the predictability of device orientation events: I'm
> all for it, but improvements to that API don't negate the need for a VR API.
> Looking forward to eventual desktop support, it's tempting to take headset
> motion and pipe it into the device orientation but this doesn't handle cases
> like Macbooks that actually do have an internal accelerometer which the web
> already exposes. It also ignores a theoretical case of multiple headsets
> connected to a single device, and fails to address the need to expose
> information about the headset optics to ensure correct rendering.

Supporting multiple motion sensors makes sense. Even so, would it make
sense to use the same representation for both the events and the
object returned by vrSensor.getState()? Concretely, that would be one
or two interfaces that are implemented by DeviceMotionEvent,
DeviceOrientationEvent and the object returned by vrSensor.getState().
If the same representation does not make sense, why not?

Philip

Kenneth Russell

unread,
Jan 20, 2015, 5:03:07 PM1/20/15
to Dominic Cooney, Brandon Jones, Sami Kyostila, Philip Jägenstedt, Anne van Kesteren, Nico Weber, Elliott Sprehn, Casey Hancock, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
On Mon, Jan 19, 2015 at 5:02 PM, Dominic Cooney <domi...@chromium.org> wrote:
> I commented privately to some Blink contributors, but let me repeat it in
> this forum:
>
> WebVR puts pressure on low-latency input and rendering which is something we
> want to be able to do anyway for other use cases. So I wholeheartedly
> support implementing WebVR in trunk.

I'd like to echo support for implementing WebVR on trunk. Having the
code in place, even behind a flag, will make it much easier for
developers to begin experimenting with the API on multiple platforms.

The form of the API will surely evolve, but note that there are some
aspects to it -- such as taking the canvas's container element
fullscreen in "VR" mode -- which aren't addressed by any other web
API. I'm in favor of implementing the currently proposed WebVR APIs
behind a flag, and evolving them iteratively as more experience is
gained with them.

-Ken

Sami Kyostila

unread,
Jan 22, 2015, 3:43:27 PM1/22/15
to Kenneth Russell, Dominic Cooney, Brandon Jones, Philip Jägenstedt, Anne van Kesteren, Nico Weber, Elliott Sprehn, Casey Hancock, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
Great point about the latency benefits. I'd also love to see this getting landed behind a flag.

- Sami

Nico Weber

unread,
Jan 22, 2015, 4:05:00 PM1/22/15
to Brandon Jones, Elliott Sprehn, Casey Hancock, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
On Thu, Jan 15, 2015 at 4:53 PM, Brandon Jones <baj...@google.com> wrote:
It's true that you can polyfill some of the WebVR functionality on top of device orientation events. The cardboard.jar can provide features like motion prediction, drift correction, and neck modeling that improve the quality of the signal, though. It also provides information about the optics of the harness being used so that applications can correctly render with the right IPD, field of view, and render target size.

There are also some historical issues that prevent device orientation events from being appropriate for realtime content like VR. Until very recently the API only sampled at 20Hz, though we were able to bump that up to 60Hz. Even with that increase, however, the polling is not fast enough to do high-quality motion prediction. The fact that it's an event is problematic as well, since the events do not always arrive in sync with rAF callbacks. This can cause stuttering, as you either get two motion events for a single frame or no motion events at all. WebVR is explicitly a polling API, which avoids that problem completely.
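The polling pattern being contrasted with events looks roughly like this. A sketch only: `getState()` matches the proposed API as used elsewhere in this thread, but the field names on the returned state are assumptions, and the fake sensor stands in for real hardware:

```javascript
// A stand-in sensor; a real vrSensor.getState() would return the latest fused
// hardware pose. Here each poll just advances a fake yaw reading.
function makeFakeSensor() {
  let yaw = 0;
  return { getState() { yaw += 0.01; return { orientation: { yaw } }; } };
}

// One frame of a poll-driven loop: sample the pose exactly once, in lockstep
// with rendering, so a frame never sees zero or two pose updates.
function renderFrame(sensor, draw) {
  const state = sensor.getState();
  draw(state.orientation);
  return state;
}

// In a page this would be driven by requestAnimationFrame:
//   function tick() { renderFrame(vrSensor, drawScene); requestAnimationFrame(tick); }
```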

There are additional benefits to using the Cardboard SDK that will come into play down the road, such as capturing magnet pull events,

Shouldn't that be done through e.g. the gamepad api? Seems weird to have a dedicated api for this.

Anyhoo, it seems like this is an area explicitly called out as something to focus on this year (http://bit.ly/blinkon3-keynote , slide 20), so if this is really just 70kB I retract my concerns. Please keep an eye on reusing existing APIs where it makes sense, and don't make the API too specific (say, don't have an "isMagnetPulled").

Is there a spec now?

Dimitri Glazkov

unread,
Jan 22, 2015, 5:15:50 PM1/22/15
to Nico Weber, Brandon Jones, Elliott Sprehn, Casey Hancock, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
I don't think we should block implementation and landing this code in trunk.

However, we should make it an explicit goal to reconcile the current implementation into a more coherent story API-wise. As many peeps pointed out, the current API looks like a set of VR-related sugar and convenience bits, conflated with specific polling rate/frame rate/device enumeration needs.

We should decouple the two, make the platform accommodate the latter, and let nascent libraries fill in the former.

:DG<

Brandon Jones

unread,
Jan 23, 2015, 6:47:39 PM1/23/15
to Nico Weber, Elliott Sprehn, Casey Hancock, blink-dev, Vangelis Kokkevis, Andrew Scherkus, Adam Barth
On Thu Jan 22 2015 at 1:04:58 PM Nico Weber <tha...@chromium.org> wrote:
Shouldn't that be done through e.g. the gamepad api? Seems weird to have a dedicated api for this.

I covered this privately with someone else on this thread, so I'll just copy my response here again:

We considered it! The biggest barrier there is that the Gamepad API is specced to normalize all axis inputs to [-1.0, 1.0]. This works out okay-ish for orientation, but it falls apart when you start looking at things like position and acceleration. You can try to place limits on it (the [-1, 1] range on axes 0, 1, and 2 represents a 5-meter cube centered on the user's initial head position; axes 3, 4, and 5 represent acceleration, with 5Gs at the extremes), but it feels supremely awkward. Put simply, the Gamepad API was never meant for anything but gamepads (and a reasonably narrow subset of gamepad functionality at that).
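To make the awkwardness concrete, here is what squeezing head position through Gamepad-style [-1, 1] axes would look like (the 5-meter cube is the hypothetical limit from the message above, not anything specced):

```javascript
// Half-width of the hypothetical 5m cube centered on the initial head position.
const HALF_RANGE_METERS = 2.5;

// Position has to be clamped and normalized to fit a gamepad axis...
function positionToAxis(meters) {
  const clamped = Math.max(-HALF_RANGE_METERS, Math.min(HALF_RANGE_METERS, meters));
  return clamped / HALF_RANGE_METERS;
}

// ...and every consumer must know the magic range to recover meters.
function axisToPosition(axisValue) {
  return axisValue * HALF_RANGE_METERS;
}

// Any movement beyond the arbitrary cube is silently lost:
positionToAxis(10); // clamps to 1.0, indistinguishable from standing 2.5m away
```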

Anyhoo, it seems like this is an area explicitly called out as something to focus on this year (http://bit.ly/blinkon3-keynote , slide 20), so if this is really just 70kB I retract my concerns. Please keep an eye on reusing existing APIs where it makes sense, and don't make the API too specific (say, don't have an "isMagnetPulled").

I don't intend to expose anything as specific as that, though developers do want to be able to observe that action somehow. The current plan is that if we detect a magnet pull and we know the device is in a Cardboard harness, we generate a synthetic tap. Most Cardboard apps already handle taps as an alias for magnet pulls, and some third-party Cardboards have mechanical triggers that generate taps, so that would provide the desired input in a broadly consistent way without needing any new API surface.

More generally, there's absolutely a desire to take advantage of existing web APIs when it's practical to do so. If that ends up requiring some rewriting on WebVR's part, I'm totally fine with that. That's why it's going to be behind a flag for the foreseeable future. :)

Is there a spec now?

Vladimir Vukićević (Mozilla) has been working on one. I'll see if we can get it up somewhere publicly accessible soon.

--Brandon