Re: [w3ctag/design-reviews] WebXR Depth API (#550)

> One initial bit of feedback: would be good to see some non-visual use cases, such as realistic sound effects taking the room geometry into account. (...)

This is a great idea! I'll add it to the explainer - it should be possible to implement this with the API, but it will likely require performing scene reconstruction from the returned data first.

I think it'd be easier to initially ignore the `normTextureFromNormView` matrix (i.e. assume it's identity for a while) and focus on what data is being returned - I tried to explain that in the [Interpreting the data](https://github.com/immersive-web/depth-sensing/blob/main/explainer.md#interpreting-the-data) section.
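For a rough sketch of what that looks like in practice (sketch only, not the exact explainer code - it assumes identity `normTextureFromNormView`, and assumes the buffer is row-major `uint16` values expressed in millimeters, as the explainer describes):

```js
// Sketch: read a depth value at normalized view coordinates (x, y),
// both in [0, 1]. Assumes depthInfo.data is an ArrayBuffer backing
// row-major uint16 values in millimeters.
function getDepthInMeters(depthInfo, x, y) {
  const column = Math.min(Math.floor(x * depthInfo.width), depthInfo.width - 1);
  const row = Math.min(Math.floor(y * depthInfo.height), depthInfo.height - 1);
  const data = new Uint16Array(depthInfo.data);
  return data[row * depthInfo.width + column] / 1000; // mm -> m
}
```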

`normTextureFromNormView` is there to allow devices to return the data in the same format irrespective of the current device orientation and the coordinate system of the underlying platform - this way we don't have to perform the data adjustment in the implementation (that would be costly!), but it does mean the API is a bit more complicated to use. :( The explainer focuses on "what does the API provide?" rather than "why does the API provide it like this?", but if you think it'd be helpful, I can add some text around the design choices here.

Example: WebGL textures have their origin in the bottom-left corner, but Android assumes the screen origin is in the top-left corner - if we used the texture we got from ARCore on Android as-is, everything would be flipped. Example 2: ARCore on Android always returns data assuming the device is in landscape orientation - if the user happened to enter the AR session in portrait mode & we tried to use the texture as-is, everything would be rotated.
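Concretely, the page would run its normalized view coordinates through the matrix before touching the buffer - something along these lines (again a sketch, assuming `normTextureFromNormView` is an `XRRigidTransform` whose `matrix` is the usual 16-element column-major `Float32Array`):

```js
// Sketch: map normalized view coordinates to normalized texture
// coordinates via normTextureFromNormView, then sample the buffer.
function sampleDepthInMeters(depthInfo, viewX, viewY) {
  const m = depthInfo.normTextureFromNormView.matrix;
  // Apply the transform to the homogeneous point (viewX, viewY, 0, 1):
  const texX = m[0] * viewX + m[4] * viewY + m[12];
  const texY = m[1] * viewX + m[5] * viewY + m[13];
  return getDepthInMeters(depthInfo, texX, texY); // helper from above
}
```

With that one extra matrix multiply on the page's side, the flipped and rotated cases from the two examples above collapse into the same sampling code.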
