Introducing myself and offering to contribute

Hi Everyone!

Firstly, great work on WebXR so far. I've spent the last week or so deep-diving into the current state of the specifications and the Chrome implementation, and it has moved on a great deal since I last looked into it a couple of years ago.

The WebXR APIs for headsets and tracked controllers look great. I especially like how the input abstraction works all the way from screen taps, via simple Cardboard-style gaze-and-tap, to fully tracked controllers. We're actually going to be attempting to implement the headset and controller APIs for our Zapbox product (which I'll send a separate mail about, as I think it might interest members of the group), but at first glance the current specifications seem to cover everything we need.
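
As a quick aside for anyone less familiar with why that input abstraction is so appealing from the content side, a minimal sketch (TypeScript-ish; 'session', 'refSpace' and placeContentAlongRay() are assumed or hypothetical here, not part of the spec) might look like:

    // The same 'select' handler covers screen taps, Cardboard-style gaze and
    // tap, and tracked controller triggers.
    function onSelect(event: XRInputSourceEvent) {
      const pose = event.frame.getPose(event.inputSource.targetRaySpace, refSpace);
      if (pose) {
        placeContentAlongRay(pose.transform); // hypothetical content helper
      }
    }
    session.addEventListener('select', onSelect);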

Where I would like to contribute to discussions is around handheld mobile AR, where I think there's scope for further improvements to better support real-world use cases. I'm happy to join any appropriate groups or conference calls - let me know the best way to contribute.

I have a couple of specific suggestions that I'll open GitHub issues about (I'm tangobravo on GitHub), but I thought this initial introduction was better suited for the mailing list.

My background and relevant experience

I've been involved in handheld mobile AR since starting my PhD in 2007 (per-frame image marker detection running at ~30 FPS on the 300 MHz ARMv6 Nokia N95). I co-founded Zappar back in 2011 and have been there ever since as the Chief R&D Officer.

We have an in-house AR Creative Studio that has delivered more than 1,000 projects over the last decade, which gives us a good understanding of the practical requirements and expectations for commercial AR projects. Over recent years we've seen a steady shift from native app deployments towards those that target the web.

On the technical side, we have our own cross-platform engine for AR content with native iOS and Android runtimes. When the combination of WebAssembly, Device Motion, getUserMedia() and WebGL shipped in mobile browsers, we were also able to implement a web runtime for the same engine - mainly WebAssembly for the core, but with a fair amount of JavaScript for the runtime features and the camera capture pipeline.
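
For anyone curious, the camera side of that web pipeline is essentially the standard getUserMedia-to-WebGL pattern, along these lines (a simplified sketch rather than our actual code; an existing WebGL context 'gl' and texture 'cameraTexture' are assumed, and error handling is omitted):

    const video = document.createElement('video');
    video.setAttribute('playsinline', ''); // needed for inline playback on iOS
    video.srcObject = await navigator.mediaDevices.getUserMedia({
      video: { facingMode: 'environment' },
      audio: false,
    });
    await video.play();

    // Per frame: upload the current camera frame as a texture, both for the
    // background render and for the WebAssembly vision code to consume.
    gl.bindTexture(gl.TEXTURE_2D, cameraTexture);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, video);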

Our content API exposes our own computer vision implementations for various tracking types, plus an abstraction around World Tracking for the native runtimes backed by ARKit and ARCore, so I also have first-hand experience of the implementation challenges of providing a consistent API layer on top of those frameworks.

More recently we've added lower-level SDKs to bring our computer vision code directly to third-party engines that target WebGL, and we've released our first beta World Tracking implementation for the web. It offers reasonable tracking quality, though there's still a significant gap to the quality of the native frameworks. There's certainly scope for implementation improvements on our side, but motion data in particular is more heavily quantized and rate-limited on the web, so there will probably always be a gap in quality.
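
To make that motion data point concrete: on the web the raw source is devicemotion events, roughly as below (a sketch only; on iOS 13+ an explicit, user-gesture-triggered permission request is needed first), and those samples arrive less often and with coarser values than the IMU data native apps can read.

    async function startMotion(onSample: (e: DeviceMotionEvent) => void) {
      const dme = DeviceMotionEvent as any;
      if (typeof dme.requestPermission === 'function') {
        // iOS requires a permission prompt triggered from a user gesture.
        if (await dme.requestPermission() !== 'granted') return;
      }
      window.addEventListener('devicemotion', onSample);
    }

    startMotion((e) => {
      // Typically delivered at roughly display rate and quantized, versus
      // higher-rate raw IMU access in native apps.
      const accel = e.accelerationIncludingGravity;
      const gyro = e.rotationRate;
      // ...feed into the tracking pipeline
    });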

Therefore we have a strong interest in leveraging WebXR where available as a progressive enhancement, and I'm keen to get involved in shaping the specifications to really make WebXR shine for our typical handheld AR use cases.
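
In practice that progressive enhancement starts with a check along the lines of the sketch below: where 'immersive-ar' is supported we'd let WebXR provide the camera and pose, and otherwise fall back to our own getUserMedia + devicemotion pipeline (function name here is just illustrative).

    async function chooseTrackingBackend(): Promise<'webxr' | 'fallback'> {
      if (navigator.xr && await navigator.xr.isSessionSupported('immersive-ar')) {
        return 'webxr'; // let the UA provide camera pose tracking via WebXR
      }
      return 'fallback'; // getUserMedia + devicemotion + our own vision code
    }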

I'll leave the specific details of my suggestions for the GitHub issues, and I look forward to discussing them in more detail over there!

Nice to meet everyone,

Simon

 
Dr. Simon Taylor
Chief R&D Officer and Co-Founder
Zappar
zappar.com   zap.works
