- From: Brandon Jones <bajones@google.com>
- Date: Tue, 05 Sep 2017 16:47:52 +0000
- Cc: "public-webvr@w3.org" <public-webvr@w3.org>
- Message-ID: <CAEGwwi3nNzKrCZF0Ec+XfzL-CeO3uLSd_+LONj1=qq9g3yvtjQ@mail.gmail.com>
This is really a WebVR spec question, so I'm moving the webgl list to bcc.

Fortunately for you, this has been a topic of much discussion within the WebVR group recently, and we have a proposed solution that we're going to implement in the next iteration of the spec. Essentially we're going to make VR input a first-class citizen rather than a generic gamepad, and additionally define certain types of "gestures" that are abstracted inputs across multiple devices. The first (and initially only) gesture will be "select", which is simply "the user performed the primary input." The exact action needed to trigger this will be platform/device specific, but always something that follows the guidelines for that system:

- Cardboard: Tap the button/screen
- Daydream: Click the touchpad
- GearVR without controller: Tap the headset touchpad
- Rift/Vive/GearVR with controller: Click the trigger
- HoloLens: Perform an air tap
- Any of the above with a gamepad: Press the "X" button or similar

And so on. This will fire a "select" event that will indicate, amongst other things, a ray into the scene that corresponds with the event. For motion controllers the ray will originate at the controller and run along the controller's natural pointing direction. For inputs that don't have motion tracking (Cardboard, GearVR w/ touchpad, Rift with remote, etc.) the ray will originate at the user's head and point outward, creating a gaze cursor. This way, for the simplest point-and-click interfaces, the developer doesn't need to worry about detecting the capabilities of the system or doing massive switch statements based on device name; they just respond to the rays given to them by the select event. (I've got some early demos in place to prove this concept out, and they work spectacularly!) Needless to say, this system takes heavy inspiration from Boris' wonderful Ray.js library.
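To sketch what consuming such a ray might look like: the event name `select` and the `event.ray` field names below are assumptions for illustration (the spec details weren't finalized at the time of this email), but the core work is device-independent ray-vs-geometry intersection, shown here as a standard ray-vs-sphere test:

```javascript
// Ray-vs-sphere intersection: returns the distance along the ray to the
// nearest hit, or null on a miss. `direction` is assumed normalized.
function raySphereIntersect(origin, direction, center, radius) {
  // Vector from the ray origin to the sphere center.
  const ox = center[0] - origin[0];
  const oy = center[1] - origin[1];
  const oz = center[2] - origin[2];
  // Project that vector onto the ray direction.
  const tca = ox * direction[0] + oy * direction[1] + oz * direction[2];
  // Squared distance from the sphere center to the ray.
  const d2 = ox * ox + oy * oy + oz * oz - tca * tca;
  const r2 = radius * radius;
  if (d2 > r2) return null;          // ray passes outside the sphere
  const thc = Math.sqrt(r2 - d2);
  const t = tca - thc;               // nearest intersection distance
  return t >= 0 ? t : null;         // sphere is behind the ray origin
}

// Hypothetical handler: whether the ray came from a tracked controller
// or from the user's head (gaze cursor), this code is identical.
function onSelect(event, targets) {
  const { origin, direction } = event.ray; // assumed field names
  return targets.filter(s =>
    raySphereIntersect(origin, direction, s.center, s.radius) !== null);
}
```

The point of the abstraction is visible here: nothing in `onSelect` branches on device type, because the platform has already translated its native "primary input" action into a ray.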
We'll share more details as they get finalized (though you could learn a lot by poking around in the WebVR W3C spec repo on GitHub), but hopefully that gives you enough to know that we've been worried about the same things and are actively addressing them!

--Brandon

On Mon, Sep 4, 2017 at 3:57 AM Evgeny Demidov <demidov@ipm.sci-nnov.ru> wrote:
> Sorry, I hope there are more developers here (than on webvr.slack.com :)
>
> https://www.ibiblio.org/e-notes/webvr/samples.htm
> I'd like to "walk" forward by pressing the Cardboard button, but what will
> you press on Oculus, PC ... then?
> Is it possible to add a universal "button" event in webvr-polyfill?
>
> "What would the web look like if there were no scrollbars, no mouse
> cursors, and no clickable links? That's what VR is like today. On one
> hand, this is great! Developers are completely free to build however
> they want, leading to a lot of interesting experiments. On the other
> hand, it takes a lot of engineering effort to just get basic
> interactions up and running. Furthermore, it lacks consistency. The
> alluring promise of being able to navigate from world to world may be
> diluted by the frustration of having to rediscover new interaction
> paradigms every time."
> http://smus.com/ray-input-webvr-interaction-patterns/ October 11, 2016
>
> Evgeny
Received on Tuesday, 5 September 2017 16:49:24 UTC