- From: Max Rebuschatis <lincolnfrog@google.com>
- Date: Tue, 31 Jul 2018 14:12:18 -0700
- To: public-immersive-web@w3.org
- Message-ID: <CA+gQFCq+O7if203yvPHE=SKveVhKnxACrEzn0XU8HLKWBvTu8Q@mail.gmail.com>
Hi all,

Here are my notes from today's meeting. Thanks to everyone for the discussion!

WebXR - AR Meeting Notes 2018/7/31

Trevor: Let's get started! We have a couple of administrative tasks to talk about:
1. Upcoming face-to-face for AR topics
2. Discuss group appetite to incorporate AR features into the initial version of WebXR
   1. Is lighting estimation part of this?
   2. Issues and PRs around hit-testing

Trevor: During the last face-to-face, we discussed holding another face-to-face on Monday, September 17th in Mountain View. We will have dial-in options if you can't attend in person. The general goal is to nail down what we want to take to the working group as an initial set of AR features. Hit-testing, anchors, and maybe lighting estimation are far enough along that we can specify them in a way that doesn't prevent future extensions as the technology unfolds. I am coordinating that, so let me know if you have questions or agenda items.

Trevor: The next topic is the main topic: there is appetite for having AR features in the initial version of the WebXR API. The ones I have heard rough consensus on are hit-testing and an initial, very simple anchors class, where anchors are simple point anchors and *perhaps* planar anchors (with or without extents is still up in the air). More recently, lighting estimation has been proposed as being more important, and there is a simple version of it that definitely doesn't solve all the problems: the "two-float" version (an intensity and a color/temperature value). There was also some talk about geo-orientation. The idea is that when you ask for a coordinate system, you can ask the system to align it with some kind of geo-spatial direction, e.g. the x-axis points north - no GPS or anything like that, just orientation. The use-case is doing things in AR like a Yelp-style display of relatively far-away, geo-spatially oriented data - the cathedral is over that way, the river is over that way. The argument is that it's much, much easier for the underlying system to do that than it would be for a developer to reconcile the higher-level geolocation APIs, which are not timestamped / frame-synced in the same way. That is the initial set of features I have heard bandied about. Are there pieces here that people feel strongly about, or known problems with any of these?
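[For concreteness, here is a minimal sketch of what the "two-float" lighting estimate and a geo-aligned frame of reference could look like against the current WebXR draft. Nothing here is specified anywhere: the alignToGeoNorth option and getLightEstimate() are purely hypothetical illustrations of the two ideas above, and the snippet assumes an already-created XRSession inside an async function.]

    // Hypothetical sketch only - neither the alignToGeoNorth option nor
    // getLightEstimate() exists in any current proposal. Assumes `session`
    // is an active XRSession.

    // Geo-orientation idea: ask for a frame of reference whose axes are
    // aligned to a compass direction (orientation only, no GPS position).
    const geoFrameOfRef = await session.requestFrameOfReference('eye-level', {
      alignToGeoNorth: true  // hypothetical option
    });

    // "Two-float" lighting estimate: a scalar intensity plus a color
    // temperature, available per-frame and frame-synced (hypothetical
    // accessor on XRFrame).
    session.requestAnimationFrame((time, frame) => {
      const light = frame.getLightEstimate();
      console.log(light.intensity, light.colorTemperature);
    });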
John: As we look at this collection of APIs, the more diverse and dense it is, the more time it will take to figure it all out. There is a balancing act between the amount of capability in the first version and when it can be made available.
Trevor: Hit-testing and anchors we have rough agreement on, and everyone agrees they are necessary; lighting estimation is in third place and geo is last in terms of importance. Should there be a couple of different steps, where we nail the first two before we move onto anything else?
David: My understanding is that we incubate these things separately and they mature at different rates based on interest. It makes sense to say "don't work on <crazy thing>" because it's totally not ready, but I don't think we need to decide that these things will be ready at a specific point. Eventually, the working group will decide on the status of these things.
Chris: The key thing is when it goes in the charter. The charter draft I am working on doesn't explicitly list lighting estimation, but it could fit under some of the umbrellas. Hit-testing is definitely in there; anchors are, to some degree.
John: The features can be done in parallel, but the charter is important - we don't want to include everything.
Chris: If we include everything in the world, people expect everything to be ready at the same time. The charter timeframe is 12-18 months, which is the minimum to get ANYTHING through the process. Anyone joining the working group is giving up their IP for all of these features. Companies who join look at the scope of the charter to decide whether they are comfortable joining. If the charter is too vague or inclusive, it discourages people from joining based on R&D efforts they might have around more boutique items.
David: We might have an idea of what we want to do on lighting estimation by late next year; it doesn't need to be in the charter because there is still time for it to be picked up later / in the next charter.
Trevor: Two points: what is in the charter, and what we include. Do we want to try to nail down the specific pieces we want to put forward to the working group? The timeline for some of these is reasonable for 12-18 months, but the ones that are not nailed down at all are probably on a longer timeframe than that. In this conversation, I hoped we could get to the point where we have general agreement on the work that happens after this meeting and on what could be mature enough to nail down at the face-to-face. There seems to be no disagreement on hit-testing and anchors, but lighting doesn't even have an issue, and geo has an issue but not much activity or interest expressed on GitHub.
Trevor: I'll put it to the floor - if someone puts up lighting estimation and runs with it in the repo, obviously we can start looking at it.
Chris: I don't think we need to worry too much about lighting estimation. The basic level of color temperature and intensity is already available in the basic lighting API. The text in the charter is simple enough that it seems OK to go in the charter. We can adopt it if someone picks it up and runs with it, but if nobody decides it's important, that's OK too - it won't be a failure if we don't deliver it in the next 12 months.
John: How does this relate to AR rendering / session creation? Is that called out?
Chris: That is called out in the charter - please look and see what you think. I will be filing a PR on the charter very soon.
Trevor: Chris, do you think geo is covered in any way by the charter, or do we need to decide on that before we file it?
Chris: It's more of a stretch, but we could get away with it. You can put other stuff in your spec; it's only an issue if someone formally objects that it was beyond your scope. I also don't think it's super concerning in terms of IP rights - just using geo orientation to drive the axes of the world doesn't sound too worrisome or interesting.
Trevor: The simplicity of it, and the fact that it is relatively disconnected from most complex geo stuff, makes it seem OK.
Chris: If someone really wants this, they need to get the spec fleshed out ASAP, and then the working group will decide whether it's something we want to take or not.
David: This seems too early - no formal proposals for scope or features exist yet.
John: Is this the one bus stop, where if you miss it you have to wait 12-18 months to get it going if it's outside the charter?
Chris: If there is something clearly outside of the charter - like the charter literally says this is out of scope - and the group decides it should be in scope, we would have to re-charter.
John: What if geo orientation wasn't called out either way?
Chris: Geo orientation could be considered ambient environmental information. Geolocation is probably explicitly excluded, since we call out a "mechanism for doing global scale" as excluded.
John: Is there a process to change the charter?
Chris: It's the same process. You don't want to wait until the charter runs out to re-charter.
John: So there is some wiggle-room for new use-cases, but there is a process to change the charter so we can figure this out moving forward?
Chris: Yes, stuff that is out of scope now may be moved from out to in.
Yashar: So what is the cost of doing that? Does it affect timelines? Is it just paperwork?
Chris: To re-charter, you come up with a new charter. That gets sent to the W3C Advisory Committee; every member company has a representative on that committee. The committee votes within 30-45 days. When the new charter takes effect, you have to re-sign up, and your company has to approve your joining, probably after a legal review to see whether it's a concerning area of IP for your company.
Yashar: So there is quite a bit of red tape if we have to re-charter.
Chris: I would not want to re-charter every 3 months.
Trevor: If we strongly suspect things should be in the first wave, we should try to get them in. The only thing not covered is geo, it sounds like. We should follow up on that.
Chris: This doesn't prevent you from doing an incubation in the group, FWIW. Multiple vendors can do the work, test it out, etc. - we just can't actually ship it.
Trevor: I will poke Blair about getting a geo issue filed. Maybe we can do the same thing for lighting estimation.
David: Is the goal of the face-to-face to determine which features are in/out?
Trevor: No, the goal is to work through any of the outstanding issues, since it seems like we are just going round and round on them without much progress. It's a time to build the consensus that we are failing to build through the repos. A goal is to have enough worked out by the end that we can say "here is a list of things that should be in the first wave", but that is a side effect rather than the goal.
David: Is that limited to the repos we have, or are we discussing proposals as well?
Trevor: Definitely proposals as well - we should discuss geo and lighting too. Along those lines, we had a late agenda issue Max raised that is a good lead-in to hit-testing. It seems like there are two or three different issues coming up for different reasons.

1) Pixel-perfect anchors (hit-test repo PR #31 / WebXR #384)

Jordan: I think #31 looks fine, though my main confusion is what determines whether we use reticle mode or object-placement mode. Do we have optional parameters for origin?
Max: I'll briefly explain the design. The idea was to have only a single hit-test API that supports both "absolute" object placement and "relative" reticles; in reality, the ray is always considered relative to the frame-of-reference that is passed in. To do object placement, you would generate a ray from the pose of the device in one of the global reference frames, like "eye-level" or "stage", and then pass in that frame-of-reference along with the ray you generated. To do reticles, you would use a ray like (0, 0, 0) -> (0, 0, 1) along with a frame-of-reference you got from the device or an input source. That frame-of-reference would need to be dynamic, as discussed in WebXR #384, so we would update that FOR as part of generating the new frame. Then, as we process the hit requests for that frame, we would create absolute rays by combining the given ray with the updated FOR for the device/input-source, and return results that should be accurate for the device's pose on the new frame. This gives you all the functionality in a single API.
Jordan: That sounds good! But would the origin always be (0, 0, 0) for a ray relative to a device?
Brandon: I don't think it would always be (0, 0, 0). You probably want to offset the origin of the ray based on screen coordinates - you might want to trigger based on UI that is off-center. It seems smart to have a full ray in both cases; it's always appropriate.
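[For concreteness, a rough sketch of the two modes Max describes, assuming a requestHitTest(ray, frameOfReference)-style API. The exact names and signature are illustrative, not final; the snippet assumes `session` and `frame` from an active WebXR session, inside an async function.]

    // Sketch only - requestHitTest() name/signature is illustrative.
    // Both modes go through one API; the ray is always interpreted
    // relative to the frame-of-reference passed in alongside it.

    // App-side helper: extract a ray from a column-major 4x4 pose
    // matrix (origin = translation, direction = forward, i.e. -Z).
    function rayFromPoseMatrix(m) {
      return {
        origin: [m[12], m[13], m[14]],
        direction: [-m[8], -m[9], -m[10]],
      };
    }

    // Object placement: build an absolute ray from the device pose in
    // a global frame-of-reference, then pass that frame in with it.
    const eyeLevel = await session.requestFrameOfReference('eye-level');
    const pose = frame.getDevicePose(eyeLevel);
    const placementRay = rayFromPoseMatrix(pose.poseModelMatrix);
    const placementHits = await session.requestHitTest(placementRay, eyeLevel);

    // Reticle: a fixed forward ray relative to a dynamic device
    // frame-of-reference; the UA re-resolves it against the device's
    // pose on the frame where the hit results are produced.
    const deviceFOR = await session.requestFrameOfReference('head-model');
    const forwardRay = { origin: [0, 0, 0], direction: [0, 0, 1] };
    const reticleHits = await session.requestHitTest(forwardRay, deviceFOR);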
Trevor: Do you want to open a PR on this?
Max: I already did - I merged a PR describing a lot of this functionality last week, and then created #31 based on a document I wrote that we discussed internally. We were hoping for explicit consent/dissent on this issue as finalized by hit-test repo PR #31, rather than just assuming consent from silence.
Trevor: Anyone want to express objections? <crickets> OK!
Trevor: Let's move onto the anchors API discussion.

2) Requesting an anchor based on an XRHitResult (anchors issue #6)

Max: I'll give the background really fast. Alex and I were having an email conversation about creating anchors based on arbitrary poses vs. passing in an XRHitResult directly. The result-based API is preferred in cases where an object is placed based on a hit-test: in the future, the result may contain trackable information, and the underlying system can use that trackable to make higher-quality anchors that are connected to the plane or other element that was hit. You can, however, just get the pose out of an XRHitResult and create an arbitrary anchor from it. Should we support both APIs? How do we ensure that developers use the result-based API when possible?
Brandon: I am going to go out on a limb and say that if there are two variants and one is clearly preferred, we should offer just the preferred one. It seems like a scenario where it doesn't hurt us if there aren't any real-world developer use-cases for creating arbitrary anchors.
Max: Arbitrary anchors do have a use-case - drawing or placing objects in the air.
John: I want to make sure we aren't over-indexing on ARCore/ARKit.
Brandon: I am curious about Windows Mixed Reality...
Yashar: I think Alex was advocating that we should support arbitrary anchors even if some platforms don't support them. In WMR, you can create an anchor at an arbitrary point; it doesn't have to be on a hit-test, and it turns out to be pretty useful. If you are in a HoloLens and you just place an object in the middle of a room, we will still create an anchor for it.
John: Is the opposite true as well? Does a hit-test always create an anchor?
Yashar: I don't think so. We can serve the 3D mesh, so you can do arbitrary hit-tests on your own.
John: Thanks, that answers my question.
Brandon: You can extrapolate that if a platform supports arbitrary anchors in space, it can support anchoring from a hit-result.
Trevor: We use both APIs in our sample code. I don't have a strong argument, though. It feels like we have consensus on "both".
Yashar: I would definitely advocate for both!
Max: This becomes a developer-education issue - making sure developers know why it is important to use the XRHitResult version of the API.
Brandon: I think the dev will do the laziest thing possible. Why bother getting a coordinate out of the result when you can just call the function on the result directly? It seems like we have a good pit-of-success here.
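[Again for concreteness, a sketch of the two variants under discussion: creating an anchor directly from a hit result vs. from an arbitrary pose. The createAnchor() names and signatures are illustrative - anchors issue #6 doesn't pin down an API - and `ray`, `poseMatrix`, and `frameOfRef` are assumed to come from app code as in the earlier sketch.]

    // Sketch only - createAnchor() names/signatures are illustrative.

    // Preferred, result-based variant: the UA can use trackable
    // information behind the hit result (e.g. the plane that was hit)
    // to maintain a higher-quality anchor. This is also the lazier
    // path for developers - the "pit of success" Brandon mentions.
    const hits = await session.requestHitTest(ray, frameOfRef);
    if (hits.length > 0) {
      const placedAnchor = await hits[0].createAnchor();
    }

    // Arbitrary-pose variant: anchor a point in mid-air (e.g. for
    // drawing), with no trackable attached.
    const midAirAnchor = await session.createAnchor(poseMatrix, frameOfRef);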
Trevor: I am wondering if we need extents and anchor offsets. I'll open an issue if I have anything smarter to say.
Trevor: OK, that's it for the agenda. Thanks for attending, everyone!
Received on Tuesday, 31 July 2018 21:13:20 UTC