Re: [w3ctag/design-reviews] Review OffscreenCanvas, including ImageBitmapRenderingContext (#141)

I can definitely see why looking at the WebVR API and seeing Yet Another rAF™️ would be worrying, but it's not a route the group has pursued lightly. Allow me to try to break down the logic behind it:

First and foremost is the semi-obvious requirement that headsets need to run at a different rate than the main monitor, and a purpose-built rAF is one practical way to achieve that. Certainly you could instead try to adjust the throttling of the page as a whole, but my gut impression is that suddenly speeding up every rAF-based operation on the page because an unrelated API was called is a bad thing. In fact, it grinds directly against the isolation concerns that have come up here.
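For concreteness, here's a minimal sketch using the shipped WebVR 1.1 shapes (the 2.0 API differs in detail, and `drawPageUI`/`drawVRScene` are hypothetical helpers). The two loops are pumped independently, each at its own display's rate:

```js
// vrDisplay was previously acquired via navigator.getVRDisplays().

// Page loop: driven by the main monitor's vsync (typically 60Hz).
function onPageFrame(time) {
  drawPageUI(time); // hypothetical page-rendering helper
  window.requestAnimationFrame(onPageFrame);
}
window.requestAnimationFrame(onPageFrame);

// VR loop: driven by the headset's display (commonly 90Hz or more),
// ticking at its own rate without re-throttling the page's rAF.
function onVRFrame(time) {
  drawVRScene(time); // hypothetical VR-rendering helper
  vrDisplay.requestAnimationFrame(onVRFrame);
}
vrDisplay.requestAnimationFrame(onVRFrame);
```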

Second, VR is an area where latency matters quite a bit, and we've seen with the existing spec we're trying to replace that there are a lot of ways for developers to accidentally make things worse for themselves when polling device poses independently of pumping the frame loop. As a result, we've made the design decision to have our variant of rAF also be the mechanism that supplies the device tracking data. That way we can make stronger guarantees about the relationship between the pose we deliver and the frame that gets rendered in response.
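Again sketching with the WebVR 1.1 shapes for illustration (`render` is a hypothetical helper): because the pose arrives as part of the frame loop rather than through ad hoc polling, the UA knows exactly which pose each submitted frame was rendered against.

```js
// vrDisplay was previously acquired via navigator.getVRDisplays().
const frameData = new VRFrameData();

function onVRFrame() {
  // The pose is sampled as part of the frame loop, not polled
  // independently, so pose and frame stay in lockstep.
  vrDisplay.getFrameData(frameData);
  render(frameData.pose, frameData.leftViewMatrix, frameData.rightViewMatrix);
  vrDisplay.submitFrame(); // ties the rendered frame back to that pose
  vrDisplay.requestAnimationFrame(onVRFrame);
}
vrDisplay.requestAnimationFrame(onVRFrame);
```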

This is also viewed as something of a security mechanism: we want to avoid a world where pages casually spin up highly accurate positional tracking in the background. Having a tight correlation between the render loop (for magic window, in this case) and pose delivery ensures that we can do some basic checking along the lines of "you really should be rendering something in response to these poses, or we're going to stop providing them." We can also easily correlate the frame loop with a specific output surface, so that we can suspend it when the related element is scrolled off screen, which is not practical with rAF. (Similarly, in VR browsers we have several scenarios where the VR rAF needs to be suspended or throttled, say when using the VR controller to input a password, while we may still want to show the page itself at that time.)

So yes, it's something we've given a lot of thought to. Of course it would be ideal if there were "one frame loop to rule them all," but I don't see that being practical when you have needs as specific as ours, especially given the relatively loose behavior of rAF as it's defined today.

To address a few other questions:

> To what extent is second-display (main-screen) a hard requirement for WebVR 2.0?

Not at all for the most performance-sensitive systems (mobile), and mildly important for desktop, mainly because it would be a little weird if the browser just froze or blanked out whenever you started looking at VR content. If we initially had to suppress main-screen rendering while in VR, we could do that, but there doesn't seem to be much technical reason to aside from performance concerns (which I'll talk about in a bit).

> To what extent does WebVR want/need Offscreen Canvas support?

We don't *need* it, but we definitely want OffscreenCanvas to be a first-class citizen with WebVR! The assumption has always been that we would be able to use it, and I can see multiple cases where it would be useful. I should note that, just as with a normal canvas, the intent is for WebVR to still use its own rAF when rendering to an OffscreenCanvas, for all the reasons given above.
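As a rough sketch of how that might look, assuming WebVR ends up exposed to workers (that's the aspiration being discussed, not something that has shipped; `drawVRScene` is a hypothetical helper):

```js
// main.js: hand rendering control of the canvas to a worker.
const canvas = document.querySelector('canvas');
const offscreen = canvas.transferControlToOffscreen();
const worker = new Worker('vr-worker.js');
worker.postMessage({ canvas: offscreen }, [offscreen]);

// vr-worker.js: render on the worker thread, still pumping the
// VR-specific rAF rather than the worker's own scheduling.
// Assumes vrDisplay is obtainable in a worker, which is the goal here.
onmessage = (event) => {
  const gl = event.data.canvas.getContext('webgl');
  function onVRFrame() {
    drawVRScene(gl); // hypothetical rendering helper
    vrDisplay.requestAnimationFrame(onVRFrame);
  }
  vrDisplay.requestAnimationFrame(onVRFrame);
};
```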

> What is the proposed method today for WebVR keeping parent document main threads from interfering in iframe'd VR content?

We are still discussing this as a group, especially after the TAG review. I haven't been viewing it as a critical "must solve prior to launch" problem, though. The API itself is being designed to be self-contained, with minimal dependencies on the DOM, mostly so it can function in workers, but also so that if we decide an isolated environment is beneficial it can work there easily. My primary concern with that type of environment, though, is that isolating WebVR from the DOM is really the easiest part of the problem. How to handle mouse/keyboard input, or things like video playback, in that kind of environment strikes me as a far harder issue.

I know it's been proposed that we could spec out a specialized "meta-document" environment that gives you a performance-isolated place to play in, which sounds cool, but I would expect that WebVR would largely "just work" in such an environment, and that its specification is something a much larger group than just the WebVR community group will want a hand in.

> How do engines decide to enter a "high-performance" mode for VR?

There are a couple of things you could be referring to here, and I'm not sure which. With mobile VR there is a "sustained performance mode" that almost all apps use, which is explicitly **not** high performance. Instead it focuses on running the device at a lower performance level that provides stronger guarantees about not being thermally throttled. This kicks in automatically today when pages begin presenting VR content, and the plan is to continue doing so.

There are reasons why apps may want to opt out of that mode, sometimes temporarily (such as to speed up loads), but I don't see that as critical to expose to the web at this point.

You may also be referring to how to target the appropriate GPU in multi-GPU systems. This is actually addressed in the explainer (look for [setCompatibleVRDevice](https://github.com/w3c/webvr/blob/master/explainer.md#setting-up-a-vrlayer)).
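Based on that explainer section, usage would look roughly like this (the exact signature is my inference from the section, so treat it as illustrative rather than definitive):

```js
async function setUpVRContext(canvas, vrDevice) {
  const gl = canvas.getContext('webgl');
  // Ask the context to target the GPU the headset is attached to;
  // on multi-GPU desktops this may involve a context restore.
  await gl.setCompatibleVRDevice(vrDevice);
  return gl;
}
```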

Finally, you may be asking how to enter the theoretical "performance isolation" mode discussed in the previous question's answer, in which case I'd repeat that while we're discussing it, we don't have solid plans at this point.

I'm super happy to discuss all of this to see if there are better solutions to be had, but I'm also wary of getting into a situation where forward progress on the WebVR spec and implementation is blocked on something like chasing an idealized uber-rAF.
