Re: Modules split

On Mon, Aug 5, 2019 at 9:40 AM Brandon Jones <bajones@google.com> wrote:

> Hi Rik,
>
> I'm working on a more complete response, but I'm taking some time to make
> sure that I'm not inaccurately representing anyone's position or offering
> any conflicting messages. But for the moment I want to make sure we're
> clear on the use and purpose of *environmentBlendMode:*
>
> We heard from the HoloLens team that they wanted to have the ability to
> display what was then the equivalent of *"immersive-vr"* content, but
> provide developers a hint to determine how it would actually be shown in
> order to allow them to either tweak the content for that scenario or
> possibly optimize it by avoiding rendering certain things that wouldn't
> display properly anyway (i.e. certain shadow effects). Thus, by allowing
> *"immersive-vr"* sessions to specify an *"additive"*
> *environmentBlendMode*, headsets like HoloLens or Magic Leap are allowed
> to advertise support for VR content if they so choose. It's also a
> perfectly valid choice for a given platform not to do that if they don't
> feel the user experience will be a positive one, as you indicated.
>
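
For concreteness, here is a minimal TypeScript sketch of the kind of hint Brandon
describes, assuming the WebXR DOM types; the blend-mode values are from the current
draft, but the settings in the comments are hypothetical app choices, not anything
defined by the spec:

    // Sketch only: derive rendering choices from the reported blend mode.
    // "useDarkShadowPass" and "compositesOverRealWorld" are hypothetical
    // app-level settings used for illustration.
    function configureForBlendMode(session: XRSession) {
      const mode = session.environmentBlendMode;
      return {
        // On "additive" optics black is effectively transparent, so dark UI and
        // dark shadow passes won't show up; "alpha-blend" composites the scene
        // over camera imagery instead.
        clearColor: mode === "opaque" ? [0, 0, 0, 1] : [0, 0, 0, 0],
        useDarkShadowPass: mode === "opaque",
        compositesOverRealWorld: mode !== "opaque",
      };
    }
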

It would be good to hear from Microsoft why they wanted this (and not just
use 'immersive-ar').
Maybe we can continue this on GitHub? I filed an issue there.

FWIW I think *environmentBlendMode* should only be meaningful in an
'immersive-ar' session. https://www.w3.org/TR/webxr/#xrsessionmode-enum
already implies this.


> On Sat, Aug 3, 2019 at 12:52 PM Rik Cabanier <rcabanier@magicleap.com>
> wrote:
>
>>
>>
>> On Sat, Aug 3, 2019 at 12:13 PM Klaus Weidner <klausw@google.com> wrote:
>>
>>> Ah, I think I may have figured out a possible disconnect here that
>>> resulted from the spec split.
>>>
>>> Currently, the core spec still describes all the blend modes, and this
>>> is a read-only attribute that's implicitly set based on the requested mode
>>> and device type. Now that "immersive-ar" is no longer in the core spec,
>>> this is a bit confusing and may be open to misinterpretation, where
>>> developers may think they need to request a specific blend mode to do AR.
>>> For example:
>>>
>>> Note: Most Virtual Reality devices exhibit "opaque"
>>>> <https://immersive-web.github.io/webxr/#dom-xrenvironmentblendmode-opaque>
>>>> blending behavior. Augmented Reality devices that use transparent optical
>>>> elements frequently exhibit "additive"
>>>> <https://immersive-web.github.io/webxr/#dom-xrenvironmentblendmode-additive>
>>>> blending behavior, and Augmented Reality devices that use passthrough
>>>> cameras frequently exhibit "alpha-blend"
>>>> <https://immersive-web.github.io/webxr/#dom-xrenvironmentblendmode-alpha-blend>
>>>> blending behavior.
>>>
>>>
>>> As far as I know there's currently no way to get "alpha-blend" in the
>>> core spec; that only makes sense for "immersive-ar". The blend mode is
>>> read-only, and "immersive-vr" would never be intended to use alpha
>>> blending. It should be opaque, but an AR headset with transparent optics
>>> that can't do that would report "additive" blending to inform the app.
>>> This would be more along the lines of a compatibility mode that allows
>>> VR experiences to run on an AR headset, where a developer might use the
>>> blend mode information to avoid unintentionally invisible black UI
>>> elements; it would not be intended for actual AR applications.
>>>
>>> I think this note and the blend modes in general need a clarification
>>> that this functionality is primarily intended for future additional
>>> modes (even if not explicitly naming "immersive-ar" here). "immersive-vr"
>>> applications should expect "opaque", but should be aware that they may
>>> get "additive" on some devices that don't support opacity, and are
>>> encouraged to use a color scheme that works in this context if
>>> applicable. But should it also make clear that applications should not
>>> use "immersive-vr" for AR applications, with a note that "alpha-blend"
>>> is not used with "immersive-vr"?
>>>
>>
>>  environmentBlendMode should only apply to AR experiences. For
>> "immersive-vr", the attribute should always be "opaque".
>>
>>
>>> I agree that we shouldn't end up with developers using "immersive-vr" to
>>> write AR apps that then only work on AR headsets with "additive" blending
>>> behavior; for example, these apps would not work for smartphone AR, which
>>> uses "alpha-blend".
>>>
>>
>> Yes. Switching on this enum will cause confusion for authors.
>> Worse, if they don't switch on the enum, you get a very bad experience on
>> an AR headset where you fall over furniture or small children.
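
To make the alternative concrete, here is a sketch of the pattern that avoids
switching on the enum: request the mode that matches the experience up front. This
assumes the "immersive-ar" mode from the proposed AR module, which is not in the
core spec yet:

    // Sketch: pick the session mode that matches the experience instead of
    // requesting "immersive-vr" and branching on environmentBlendMode later.
    // "immersive-ar" is assumed from the draft AR module.
    async function startSession(xr: XRSystem, wantsAR: boolean): Promise<XRSession> {
      const mode: XRSessionMode = wantsAR ? "immersive-ar" : "immersive-vr";
      if (!(await xr.isSessionSupported(mode))) {
        throw new Error(`${mode} is not supported on this device`);
      }
      return xr.requestSession(mode);
    }
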
>>
>>
>>>
>>> On Sat, Aug 3, 2019 at 11:39 AM Klaus Weidner <klausw@google.com> wrote:
>>>
>>>> On Sat, Aug 3, 2019 at 12:49 AM Rik Cabanier <rcabanier@magicleap.com>
>>>> wrote:
>>>>
>>>>> On Sat, Aug 3, 2019 at 12:33 AM Klaus Weidner <klausw@google.com>
>>>>> wrote:
>>>>>
>>>>>> On Sat, Aug 3, 2019, 00:16 Klaus Weidner <klausw@google.com> wrote:
>>>>>>
>>>>>>> I'm not one of the spec editors, but is this really a blocking
>>>>>>> issue? The spec already says that *"Future specifications or
>>>>>>> modules may expand the definition of immersive session to include additional
>>>>>>> session modes"*, and I think the initial AR module draft is
>>>>>>> starting imminently. Presumably the browser police won't be confiscating
>>>>>>> headsets for non-compliance if they implement a mode from a pending draft
>>>>>>> module that isn't in the draft core spec?
>>>>>>>
>>>>>>
>>>>>> Sorry, I didn't mean to imply that standards compliance is
>>>>>> unimportant. It would be unfortunate if there were an extended gap where
>>>>>> core WebXR is final but the AR module isn't ready yet even for minimal
>>>>>> "poses only" use cases, but my impression is that the editors and working
>>>>>> group are trying their best to avoid that. At this point it's all
>>>>>> technically still in draft status.
>>>>>>
>>>>>
>>>>> No worries! :-)
>>>>> Yes, I'd prefer that it go in the spec since we don't know how long
>>>>> the AR module will take. We will be telling authors to use 'immersive-ar'
>>>>> and they might (rightly) be concerned that this is not in the standard.
>>>>>
>>>>> I'm concerned that the explainer is encouraging authors to request VR
>>>>> on AR devices and look at the environmentBlendMode attribute. We
>>>>> definitely don't want to support this, and I suspect Microsoft will feel
>>>>> the same for the HoloLens.
>>>>>
>>>>> What are 'minimal "poses only" use cases'?
>>>>>
>>>>
>>>> (Standard disclaimer, these are my unofficial opinions and
>>>> interpretations of the spec and process.)
>>>>
>>>> What I meant is that taking the current core spec and just adding an
>>>> "immersive-ar" mode results in an AR mode is extremely similar to
>>>> "immersive-vr" with a transparent background. Basically the app is just
>>>> getting poses relative to reference spaces but doesn't have any significant
>>>> real-world understanding. At most it can get a floor level by using a
>>>> "local-floor" reference space, but that's originally intended for a
>>>> limited-size space, not walking-around AR (*"the user is not expected
>>>> to move beyond their initial position much, if at all"*) and can't
>>>> cope with not-quite-flat environments. (I think issuing "reset" events when
>>>> the floor level changes wouldn't be in the spirit of the spec). In
>>>> "unbounded" space, there's no floor level available to the app. An app
>>>> could request both reference spaces and assume that the "local-floor" level
>>>> is valid globally, but that doesn't seem like a safe assumption.
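
A small sketch of the reference-space trade-off described above, assuming a granted
session; the fallback order here is just one possible choice:

    // Sketch: "local-floor" reports a floor level but is meant for a roughly
    // stationary user; "unbounded" allows walking around but provides no floor,
    // so placing content on the ground would need real-world understanding.
    async function getReferenceSpace(session: XRSession): Promise<XRReferenceSpace> {
      try {
        return await session.requestReferenceSpace("local-floor");
      } catch {
        return await session.requestReferenceSpace("unbounded");
      }
    }
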
>>>>
>>>> I was under the impression that some form of hit testing or other
>>>> real-world understanding such as planes or meshes is fairly essential for
>>>> AR applications, i.e. to support interacting with tables or walls, so I
>>>> thought the AR module was aiming to have something along these lines
>>>> included. If you think that it would already be very useful for AR headsets
>>>> to have a minimal AR mode without real-world understanding to avoid being
>>>> in unspecified territory, would it help to start with a very small
>>>> "poses-only AR" module that basically just introduces "immersive-ar" and
>>>> explanations around environment blending etc., but skipping XRRay and
>>>> anything related to hit testing or other real-world understanding? If yes,
>>>> I think this would be a useful discussion to have.
>>>>
>>>> It doesn't look good if a customer asks if we support WebXR and we say
>>>>> that it only works if they use a non-standard extension...
>>>>>
>>>>
>>>> That's kind of the question here - would apps really be able to work
>>>> just with core WebXR + "immersive-ar" for poses, or would they need
>>>> real-world understanding in some form also, in which case they'd
>>>> potentially be back to using not-yet-standard extensions? Is your point
>>>> that it would be important to distinguish this, i.e. by being able to say
>>>> that it is indeed using standard core WebXR + a standard "poses-only AR"
>>>> module that provides "immersive-ar", and there's also separate support for
>>>> draft real-world-understanding modules such as hit testing, plane
>>>> detection, anchors, etc.?
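
One way an app could make that distinction in practice is to feature-detect the
draft capabilities separately from the core + "immersive-ar" baseline. A sketch,
with requestHitTestSource coming from the draft hit-test module rather than the
core spec:

    // Sketch: distinguish "poses-only AR" from a runtime that also implements
    // the draft hit-test module. The cast is needed because the hit-test API is
    // not part of the core WebXR types.
    async function maybeGetHitTestSource(session: XRSession, viewerSpace: XRSpace) {
      const s = session as any;
      if (typeof s.requestHitTestSource !== "function") {
        return null; // Core + "immersive-ar" only: poses, no real-world geometry.
      }
      return s.requestHitTestSource({ space: viewerSpace });
    }
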

Received on Monday, 5 August 2019 17:55:26 UTC