Re: Chrome WebVR available only on secure origins

On Wed, Jul 13 2016 at 04:29, Brandon Jones <bajones@google.com> wrote:

 > We realize that some developers have strong opinions on this subject. We
 > welcome feedback, *especially* if this policy makes your planned use case
 > infeasible! But we also feel that the development community around a new
 > feature like this is actually in the best position to gracefully handle
 > this requirement. WebVR projects are less likely to have large amounts of
 > legacy code that needs to be updated to support HTTPS. Additionally,
 > efforts like Lets Encrypt are in full swing and make it easier than ever
 > to make your sites secure.


Thanks for the advance warning on this change.  This will definitely 
cause some problems for one of my use cases.  Let me explain.

My current project is JanusWeb, an in-browser WebGL/WebVR client for 
JanusVR rooms.  JanusVR rooms are 3D worlds defined by simple JSON or 
XML markup that can be written by any user and hosted on any website 
- similar in concept to A-Frame.  These rooms can be linked together 
with portals, allowing seamless movement from room to room, even 
though each room may be hosted on a different server, controlled by a 
different entity.  Originally these rooms were written to be viewed in 
the native JanusVR client, but JanusWeb makes that content accessible 
directly in the browser.
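
To make that flow concrete, here's a rough TypeScript sketch of what a 
client like JanusWeb ends up doing: fetch a room definition from 
wherever it happens to be hosted, then follow portal links that may 
point at entirely different origins.  The field names here are 
hypothetical, not the actual JanusVR schema.

    // Hypothetical room structure - not the real JanusVR schema.
    interface PortalDef {
      url: string;      // destination room, possibly on another origin
      title?: string;
    }

    interface RoomDef {
      objects: { src: string; pos?: string }[];  // assets referenced by URL
      portals: PortalDef[];                      // links to other rooms
    }

    // Fetch and parse a JSON room definition from any host.
    async function loadRoom(roomUrl: string): Promise<RoomDef> {
      const response = await fetch(roomUrl);
      if (!response.ok) {
        throw new Error('Failed to load room: ' + response.status);
      }
      return (await response.json()) as RoomDef;
    }

    // Walking through a portal just loads the next room, which may live
    // on a completely different server run by a different author.
    async function traversePortal(portal: PortalDef): Promise<RoomDef> {
      return loadRoom(portal.url);
    }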

So here's the problem.  The vast majority of these rooms are being 
created by amateur users, not professional web developers.  The pages 
are hosted in a wide variety of ways - some content lives on 
large-scale professional hosting services specifically geared towards 
VR content, while the rest is scattered across a hodgepodge of 
personal web servers, free web hosting services, AWS, Dropbox, 
distributed services like IPFS, or even temporary pages on sites like 
Pastebin.

Since this content is hosted in a completely decentralized fashion, it's 
a usability challenge to tell users "you can only view this content 
in VR if the site it's hosted on uses HTTPS," especially when the 
content authors may be people who have enough technical ability to 
cobble together some JSON or XML markup, but not necessarily enough to 
register a domain, set up a server, run Let's Encrypt, and configure 
everything properly.

To give some real-world numbers: of a sample of 3200 existing JanusVR 
rooms, only 300 are currently served over HTTPS.  So if WebVR is 
only available on HTTPS websites, that means only about 10% of this 
particular ecosystem would be viewable in VR.  I would expect a similar 
distribution for related projects like A-Frame.  This of course poses 
usability challenges - do you drop someone out of VR when they walk 
through a portal to an HTTP website and force them to switch to 2D 
browsing mode?
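
For illustration, here's a hypothetical TypeScript sketch of the kind 
of fallback a client would be forced into under a secure-origin policy 
(getVRDisplays is cast through any since WebVR isn't in the standard 
DOM typings, and none of this is JanusWeb's actual code):

    // Can we keep presenting in VR after walking through this portal?
    function canStayInVR(portalDestination: string): boolean {
      const destination = new URL(portalDestination, window.location.href);
      // Under a secure-origin policy, a portal leading to an http:// page
      // means VR presentation would no longer be allowed there.
      return destination.protocol === 'https:';
    }

    async function followPortal(portalDestination: string): Promise<void> {
      const nav = navigator as any;
      const displays: any[] = nav.getVRDisplays ? await nav.getVRDisplays() : [];
      const presenting = displays.find((d) => d.isPresenting);

      if (presenting && !canStayInVR(portalDestination)) {
        // Forced fallback: drop the user out of VR before navigating.
        await presenting.exitPresent();
        console.warn('Destination is not HTTPS; switching to 2D mode.');
      }
      window.location.href = portalDestination;
    }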

Another issue with HTTPS is the Schroedinger-like nature of caching.  I 
can configure my server with all the right headers to tell the browser 
to cache my content, but because it's served over HTTPS the browser may 
decide to ignore them.  My understanding (it's been a few years since I 
extensively tested this, so please correct me if I'm wrong) is that most 
browsers refuse to cache HTTPS pages to disk across browser sessions for 
security reasons.  3D content tends to be quite large, and not being 
able to cache it effectively is a serious hindrance.  Yes, there are 
tricks like storing assets in IndexedDB or localStorage, but these are 
merely hacks which come with their own restrictions and pitfalls.
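
For reference, this is roughly the kind of workaround I mean: a minimal 
sketch of manually caching a large binary asset in IndexedDB because 
the HTTP cache can't be relied on.  The database and store names are 
arbitrary, and this is not JanusWeb's actual implementation.

    const DB_NAME = 'asset-cache';
    const STORE = 'assets';

    function openDb(): Promise<IDBDatabase> {
      return new Promise((resolve, reject) => {
        const req = indexedDB.open(DB_NAME, 1);
        req.onupgradeneeded = () => req.result.createObjectStore(STORE);
        req.onsuccess = () => resolve(req.result);
        req.onerror = () => reject(req.error);
      });
    }

    async function getCachedAsset(url: string): Promise<ArrayBuffer> {
      const db = await openDb();

      // Try the locally stored copy first.
      const cached = await new Promise<ArrayBuffer | undefined>((resolve, reject) => {
        const req = db.transaction(STORE).objectStore(STORE).get(url);
        req.onsuccess = () => resolve(req.result as ArrayBuffer | undefined);
        req.onerror = () => reject(req.error);
      });
      if (cached) return cached;

      // Fall back to the network and store the bytes for next time.
      const buffer = await (await fetch(url)).arrayBuffer();
      await new Promise<void>((resolve, reject) => {
        const tx = db.transaction(STORE, 'readwrite');
        tx.objectStore(STORE).put(buffer, url);
        tx.oncomplete = () => resolve();
        tx.onerror = () => reject(tx.error);
      });
      return buffer;
    }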

VR, and WebVR in particular, are nascent technologies, and I think we're 
seeing that a lot of the early experimentation and implementation is 
being done by amateurs.  Amateurs are trying out all kinds of different 
experiences, experimenting with different ways of displaying data, 
moving around, and manipulating worlds, and generally doing all the 
research that professionals and companies aren't yet willing to commit 
full resources to.

This early experimentation is what will help the industry evolve a set 
of common best practices, and I think it's important to keep the barrier 
to entry for users to create content for this new medium as low as 
possible.  Currently, it feels a lot like the early weirdness of the 
World Wide Web in the 90s, when nobody really yet knew what to do with 
it and anything was fair game. The fact that anyone could just open up a 
text editor and start creating new content allowed us to quickly figure 
out what did and didn't work, and the entirely new fields of 
professional web design, web development, JavaScript frameworking, and 
hardcore memeing were born of the chaos.

Where would we be today if, in the 90s, web developers had been required 
to file with a centralized authority for the ability to use specific 
features on their websites?  I'm quite certain that JavaScript would 
never have made it through its first few painful years of life, and the 
web would be a very different place had that been the policy then.  I 
worry that putting up technical barriers to entry this early in its 
lifecycle will keep web-based VR from reaching its full potential.

I understand the desire to incentivize developers to switch to HTTPS for 
security reasons, but the fact is, not all content needs to be 
protected, and HTTPS does little, if anything, to protect users from the 
potential security pitfalls and bad experiences these technologies may 
expose them to, since anyone - including scammers - can trivially 
acquire a valid HTTPS certificate.  The only argument in favor of tying 
this feature to HTTPS is exactly the one that worries us the most: "if 
someone is using this technology to distribute malicious experiences, 
their certificates can be revoked to protect people from exposure to 
that content."  There's a word for that, of course - censorship.

Overall it feels a bit like a digital version of security theater, part 
of a larger argument over whether freedom of expression and anonymity 
are more or less important than our freedom to be protected from 
unpleasant things.  We understand the well-meaning intent: in an ideal 
world, all web traffic would be encrypted to protect users' privacy.  
But the current state of TLS is too heavily tied to identity, and too 
cumbersome to impose on content authors.

There's a greater conversation to be had about how we can transition the 
web as a whole to secure protocols, but locking unrelated features 
behind HTTPS to force the issue is not the way to go.


 > Thanks!

Thank you as well, for all you've done for the community.

-- James Baicoianu

Received on Thursday, 14 July 2016 22:43:21 UTC