RE: [webvr-internal] WebXR Device API call agenda, Jan 23rd 2018

Apologies for sending these notes out late!


Brandon – Describing the WebXR input proposal. The intent is to come back later and extend it with the input APIs from all the third-party libs once we understand what they all are.
Iker – Questioned the relationship to the Gamepad API.
Brandon – Concerned that the feature proposal section might be interpreted as the first round of proposals.
Kip – Raises whether we should be explicit about when controllers should be rendered (we might not want to render them in AR scenarios).
Nell – Raises that we might be able to base the decision on the mode (see-through rendering or not) or on the presence of models for the platform.
Brandon – The page may be able to track which mode it’s in. We’d potentially want to render a different cursor based on the type of pointer (gaze, 6DoF, 3DoF, screen interactions, etc.).
Nell – The concern about drawing a cursor for screen touch was whether it was necessary, as opposed to being a developer decision. Raises that HoloLens hands would be recognized as an input source.
Brandon – Feels weird about having a hand (which could show up at any time) enumerate as a constant input device, even though hands can come and go.
Nell – The hands are tracked, but have no orientation data. Raises that we could have sensible fallbacks for devices that don’t have orientation, but the fallback hasn’t been decided yet. Wants to create a consistent dev experience between a hand and a controller.
Brandon – Do you feel it’s appropriate to have a select gesture produce an input source? E.g. if I have a traditional gamepad, I probably don’t want it to show up in the VRController array (and the same goes for a touchscreen/touchpad).
Nell – Agrees; it feels weird but it also feels appropriate. She will follow up with Alex, who has done some research on this.
Blair – Reflecting on the use of pointer vs. mouse events: it can be difficult to handle all the various combinations cross-platform. Suggests it would be good to have all the inputs go through a common API, rather than falling back to other input sources.
Brandon – Feels like you’re referring to gamepad or touch specifically. The thinking is that input from those should be exposed in a common way. Some categories of input act like a keyboard but may not have a pose of their own; we want to expose those in a common way that looks similar to VR controllers, though maybe not exactly the same.
Iker – Questions whether that would fulfil all the use cases.
Brandon – Thinks this should be able to be polyfilled.
Nell – It may be possible to connect a gamepad to an input source so you’re not duplicating input. If we wanted, we could expose a gamepad in the input source array, but it would not have a grip pose; the pointer ray could come from the gaze pose. But today there’s no relation between an input source and a gamepad, and that could be a gap.
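For illustration, a rough sketch of the input source shape being discussed above. All of the names here (XRInputSourceSketch, pointerOrigin, gripPose, pointerPose, gamepad) are placeholder assumptions, not the proposal's actual IDL:

// TypeScript sketch; placeholder names only.
type PointerOrigin = "gaze" | "hand" | "screen"; // how the pointer ray is produced

interface XRInputSourceSketch {
  handedness: "left" | "right" | "none";
  pointerOrigin: PointerOrigin;
  // Tracked controllers would report a grip pose; gamepads, screen taps, or
  // hands without orientation data might not, falling back to a pointer ray
  // derived from the gaze pose as described above.
  gripPose?: Float32Array;    // 4x4 matrix, absent for non-tracked sources
  pointerPose?: Float32Array; // 4x4 matrix for the pointing ray
  // Possible direct link to an associated Gamepad object, instead of the
  // WebVR 1.1-style gamepad extension.
  gamepad?: Gamepad;
}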
Nell – Asks what the expectation is for the input sources from WebVR 1.1 when running WebXR. Do the Vive or MR controllers even appear in the gamepad array? When would we expect to get feedback on this API via an origin trial?

Iker? – Raises: is it useful enough to have these kinds of controllers enumerate in the array if you need to listen for the events anyway?
Brandon – The purpose is to communicate how many devices there are (left vs. right) and provide frame-to-frame tracking. The array would be useful for knowing which controllers to poll poses for.
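A sketch of that per-frame polling pattern, assuming hypothetical names (session.inputSources, frame.getInputPose, baseFrameOfReference) rather than the proposal's actual IDL:

// TypeScript sketch of a frame loop that polls poses for each input source.
function onXRFrame(time: number, frame: any /* hypothetical XRFrame */): void {
  const session = frame.session;
  // Enumerating the array tells us how many tracked devices exist
  // (left vs. right) and which ones we should poll poses for this frame.
  for (const inputSource of session.inputSources) {
    const pose = frame.getInputPose(inputSource, session.baseFrameOfReference);
    if (pose) {
      // drawController(inputSource, pose); // render a controller and/or cursor
    }
  }
  session.requestAnimationFrame(onXRFrame);
}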

Nell – In the context of gamepads, what does it look like in a world where we get rid of the gamepad extension and instead just link a gamepad object off of the input source? E.g. if one is present, it’s either a motion controller or it isn’t, and you can tell the difference by whether there’s a grip pose. Does that alleviate the concern about whether touch appears in the array?
Brandon – So the gamepad in the input source would just be a pointer to the gamepad object? That seems sensible to me, but I don’t know if it answers the question of whether we should surface it in the input array. I’m also hesitant because random devices enumerate to computers as gamepads and fire off random events.
Iker – Why don’t you add the event handler to the input source?
Brandon – Agrees that that seems a little more extensible, and it also localizes the API surface. But it means anything you ever want to get input from has to be surfaced in the input source array: you’d need to create an input source for a controller, and HoloLens gestures or screen taps would also have to come through as input sources. I’m worried that you’d then have to go through every single source and check for select events. Alternatively, a single “select” event could fire for every type of input device, so developers only need to listen for one event.
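The two shapes being weighed above, sketched in TypeScript. Neither is settled, and the event name and the session/inputSources identifiers are assumptions for illustration:

declare const session: any; // hypothetical XRSession, as in the sketch above

// Option A: per-input-source handlers. Everything that can produce a select
// (controllers, HoloLens gestures, screen taps) must appear in the array,
// and the developer has to wire up every source individually.
for (const inputSource of session.inputSources) {
  inputSource.addEventListener("select", (evt: Event) => {
    // handleSelect(inputSource, evt);
  });
}

// Option B: a single session-level event that fires for any input type,
// so developers only need to listen in one place.
session.addEventListener("select", (evt: any) => {
  // evt.inputSource would identify which source produced the select action.
  // handleSelect(evt.inputSource, evt);
});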


Brandon – Jordan posted an issue about displaying loading content before presentation without a subsequent user interaction.
Jordan – This comes from the polyfill, where we don’t have the same restrictions on entering VR as on native. In previous scenarios, a custom splash screen could be displayed while entering VR.
Brandon – The current polyfill instructs you to put your phone into landscape mode to enter the VR experience, and some developers have hijacked that to show their own loading screen/instructions. The issue is that in a native implementation, if someone clicks the Enter VR button and the developer then puts up a splash screen, the user-activation event is lost, so you’d have to generate another user-activation event.
Jordan – It’s unclear how long that activation would last (4 seconds or 1 hour?).
Brandon – We want to avoid users putting on their headset, then taking it off, then putting it back on in cases where VR has not been entered.
David – It’s worth discussing some kind of splash screen, because some sites can take up to 15 seconds to start rendering, which is a long time. But that’s a separate discussion from what the polyfill does.
Kip – Agrees.
Brandon – We’ll probably close this down as ‘won’t implement’; we probably don’t want to encourage people to get too clever with it.
Kip – Perhaps we can solve this by implementing layers that transform asynchronously.
Brandon – Agreed. Although that’s likely a separate discussion.
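A sketch of the pattern the group is steering toward here: request the session directly inside the user's click handler so the user activation isn't lost behind an intermediate splash screen. The names xrDevice, requestSession, and exclusive are assumptions about the API shape, not settled spec text:

declare const xrDevice: any; // hypothetical device handle

document.getElementById("enter-vr")!.addEventListener("click", async () => {
  // Requesting the session within the click handler keeps the user-activation
  // gesture intact. Showing a splash screen first and requesting the session
  // afterwards would require a second user gesture.
  const session = await xrDevice.requestSession({ exclusive: true });
  // Any loading UI would then be rendered inside the session itself
  // (e.g. a simple loading scene) rather than before entering it.
  // startLoadingScene(session);
});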


Brandon – Is there anything else anyone wants to discuss in the last 10 minutes?
Nell – We’ve been taking a look at what kind of data is necessary to give devs feedback on perf, given the capabilities of devices. We should get this on the agenda for next week’s call.
Brandon – Agrees.


Gheric – This relates to AR, specifically thinking about how multiple layers, written by multiple authors, might be supported in an exclusive experience. At GT they’re experimenting with an AR browser and are looking at extending it with the WebXR polyfill. It seems the semantics for exclusive sessions would need to change for this use case; the exclusive session seems to stand in contrast to the magic-window mode. For long-term use cases, it might be better to change “exclusive” to “immersive” to allow multiple simultaneous sessions for AR (while this may not be useful for VR).
Nell – Blair and Nell had a previous conversation in which they discussed that this was being actively prevented for user comfort and safety reasons.
Blair – I recall that conversation. Thinks the idea is not to ensure that it’s supported, but rather to make sure that it’s not prevented.
Kip – I think we should actively prevent it in the spec so that we don’t end up in a bifurcated WebXR space.
Nell – Microsoft is all for AR experiences, but isn’t sure whether WebXR is the proper vehicle for this.
Brandon – Agrees, but calls out that the shape of the API doesn’t prevent this in an exclusive mode… I can see a future where we have an exclusive mode that continues to do what it does, and then a shared mode where the semantics may be messy. It’s an important use case to keep in mind, but doesn’t think we need to force the XR API to be the vehicle. Even if we do have a mode that has this effect, the exclusive mode semantics are still useful as-is for exclusive modes.
Nell – There are a couple of non-browser “browsers” that can help experiment with these kinds of investigations.
Kip – Gheric can continue to investigate with his own platform.
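For context, a sketch of the session-creation option the rename in issue #320 would touch. The option names come from the issue title ("exclusive" vs. "immersive"); the surrounding xrDevice/requestSession names are illustrative assumptions:

declare const xrDevice: any; // hypothetical device handle

async function startSession() {
  // Current shape as discussed: one exclusive session at a time, in contrast
  // to non-exclusive magic-window rendering.
  const session = await xrDevice.requestSession({ exclusive: true });

  // The proposed rename would express the same request as, e.g.:
  //   await xrDevice.requestSession({ immersive: true });
  // which reads less like a prohibition on multiple simultaneous AR sessions.
  return session;
}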

Brandon – Trevor will be organizing next week’s meeting.


From: 'Brandon Jones' via webvr-internal <webvr-internal@googlegroups.com>
Sent: Monday, January 22, 2018 3:02 PM
To: public-webvr@w3.org
Subject: [webvr-internal] WebXR Device API call agenda, Jan 23rd 2018

Getting this out a little late, sorry!

Call date: Tuesday Jan 23rd (and every other Tuesday thereafter)
Call time: 1:00 PM PST for one hour
Video call link: https://plus.google.com/hangouts/_/google.com/webxr-community?hceid=YmFqb25lc0Bnb29nbGUuY29t.jfn4fglc76h4ruldde8pq5s8kk&authuser=0

Call Agenda Items:

  *   Review simple input proposal <https://github.com/immersive-web/webxr/issues/319>
  *   Discuss issue Display loading content before presentation without subsequent user interaction <https://github.com/immersive-web/webxr/issues/315>
  *   Discuss issue Change "exclusive" session parameter to "immersive"? <https://github.com/immersive-web/webxr/issues/320>
In addition, there are some recent issues on the GitHub tracker that do not necessarily need to be brought up on the call but deserve some attention from the group regardless:

  *   Use Texture Arrays directly with WebGL 2 <https://github.com/immersive-web/webxr/issues/317>
  *   Beginning media playback once VR has started <https://github.com/immersive-web/webxr/issues/316>
And as always if you have something you'd like to see discussed on the call please reply with suggested topics!

Regarding the video call link: I apologize, but I neglected to do the research I was hoping to do on a more generally accessible VC system in time for this week's call. I've switched to using Hangouts this week instead of Meet because it previously appeared to have better cross-browser compatibility, and it was simple for me to do. I don't consider this an acceptable long-term solution, though, because it doesn't have a call-in number for participants who would like to join by phone. If anyone has prior experience setting up a VC system for similar groups, let's talk! It would be good to get some feedback on what's worked well for others in the past.

Talk to you all tomorrow!

--Brandon
