
Re: [MIX] Require HTTPS scripts to be able to anything HTTP scripts can do.

From: Tim Berners-Lee <timbl@w3.org>
Date: Wed, 25 Feb 2015 16:03:52 +0000
Cc: Anne van Kesteren <annevk@annevk.nl>, WebAppSec WG <public-webappsec@w3.org>
Message-Id: <44B25D86-B392-4681-911C-46363AFC7430@w3.org>
To: Brad Hill <hillbrad@gmail.com>

On 2015-01-05, at 17:55, Brad Hill <hillbrad@gmail.com> wrote:

> 
> 
> On Mon Jan 05 2015 at 3:26:59 AM Tim Berners-Lee <timbl@w3.org> wrote:
> 
>> 
>>  Data is special
>> 
>> I am a web app developer, I need to be able to access any data.
>> I am happy to and indeed want to secure the scripts and HTML and CSS which are part of my app.
>> I am happy to secure access to data which I control and serve.
>> I need to be able to access legacy insecure data like the whole Linked Open Data cloud (http://lod-cloud.net/).
>> 
> 
> Are there particular obstacles to the providers of this data making it available over HTTPS or other reasons why we should assume that, over time, they will not do so? 

Yes... a huge, interconnected mass of linked data in which the terms (the predicates and the classes) are all URIs starting with "http:".

This includes data which has been archived, examples in academic papers, and code which no one is in a position to change.

There is a lot of open data in CSV and JSON, as well as RDF, which is served from "http:" only.
But the RDF case makes it very clear, so let us use RDF as an example. The most fundamental predicate in RDF is rdf:type, which connects something to its class.  For example, when you write in N3 or Turtle

	<https://timbl.rww.io/foo#alice>  a   <http://xmlns.com/foaf/0.1/Person> .

the language spec defines that the 'a' stands for <http://www.w3.org/1999/02/22-rdf-syntax-ns#type>, and so what you wrote means

	<https://timbl.rww.io/foo#alice>  <http://www.w3.org/1999/02/22-rdf-syntax-ns#type>   <http://xmlns.com/foaf/0.1/Person> .
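The expansion above can be sketched in a few lines. This is a toy illustration, not a real Turtle parser; the function name and its naive whitespace splitting are invented here purely to show how the 'a' shorthand maps to the full rdf:type URI.

```python
# Toy sketch (NOT a real Turtle parser): expand the 'a' keyword
# to the full rdf:type URI, as the Turtle/N3 specs define it.
RDF_TYPE = "<http://www.w3.org/1999/02/22-rdf-syntax-ns#type>"

def expand_a_keyword(triple: str) -> str:
    """Replace a bare 'a' predicate with the full rdf:type URI."""
    # Strip the trailing " ." and split into subject / predicate / object.
    subject, predicate, obj = triple.rstrip(" .").split(None, 2)
    if predicate == "a":
        predicate = RDF_TYPE
    return f"{subject} {predicate} {obj} ."

print(expand_a_keyword(
    "<https://timbl.rww.io/foo#alice> a <http://xmlns.com/foaf/0.1/Person> ."
))
```

Note that the identifier for rdf:type is itself an "http:" URI baked into the language spec, which is exactly the problem.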


That is simply not going to change without a massive amount of damage. These URIs are used as identifiers, not as addresses. As they should be. <grumble>The folks who insisted on the term "URL" have a lot to answer for here. These are identifiers, not locations. Don't change them. Change the way you look them up, gently, with time.</grumble>

It is imperative to upgrade what happens when you look up an "http:" URI, and not to require people to change to using "https:".
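The distinction can be sketched as follows. This is a hypothetical illustration (the function name and the `tls_available` flag are invented): the identifier string is never rewritten; only the transport used to dereference it is upgraded.

```python
# Hypothetical sketch: keep the "http:" URI as the identifier, but
# upgrade the *lookup* to a secure transport when one is available.
# Only the URL we dereference changes; the name itself never does.
def url_to_dereference(identifier: str, tls_available: bool = True) -> str:
    """Return the URL to fetch for a given (unchanged) identifier."""
    if identifier.startswith("http://") and tls_available:
        # Fetch over TLS, but the identifier stays "http:".
        return "https://" + identifier[len("http://"):]
    return identifier

term = "http://xmlns.com/foaf/0.1/Person"  # the identifier, never changed
print(url_to_dereference(term))            # fetched as https://xmlns.com/foaf/0.1/Person
```

The point of the design: documents, archives, and code keep referring to the "http:" name forever, while the lookup machinery quietly improves underneath them.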



> Are the providers of this data actually making an effort to make it usable in client-side web platform mashups?  (e.g. setting CORS headers?)
> 

Yes and no. There was a big push to get CORS headers added.

	http://enable-cors.org/

Most of the sites which are actively maintained added CORS. There are a handful of holdouts: people we have not been able to reach, who did not bother, or who did not have the authority, etc.
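The effect of that push can be stated precisely. A minimal sketch of the rule (the function name is invented; this simplifies the full CORS algorithm in the Fetch standard down to the simple-request case): a cross-origin response is readable only if it carries a permissive Access-Control-Allow-Origin header.

```python
# Simplified sketch of the basic CORS read rule (simple requests only;
# the real algorithm in the Fetch standard has many more cases).
def cross_origin_readable(response_headers: dict, requesting_origin: str) -> bool:
    allow = response_headers.get("Access-Control-Allow-Origin")
    return allow == "*" or allow == requesting_origin

open_data = {"Access-Control-Allow-Origin": "*"}   # what enable-cors.org asks for
holdout = {}                                       # no CORS header at all
print(cross_origin_readable(open_data, "https://example.org"))  # readable
print(cross_origin_readable(holdout, "https://example.org"))    # must proxy instead
```

A site in the second category is exactly the situation Brad found: the data is public, but a client-side mashup still has to proxy it.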

I expect that if we now ask people to roll out an upgrade of "http:", so that Apache 2.n+1 has it on by default (likewise node, etc.), then we will get reasonable uptake, but again some hold-outs.


> I went to http://lod-cloud.net/, picked the first resource listed on the home page and loaded the example resource (http://data.linkededucation.org/resource/lak/conference/lak2013/paper/93) . It is indeed not accessible over HTTPS, but neither does it return CORS headers so would still require proxying or a native app for client-side mashups.

(Was CORS a mistake? It has certainly been a royal pain. Should we instead have asked everyone whose data was protected implicitly by a firewall to add headers, or only made the CORS rules apply within NAT nets like 192.*, etc.?)

Well, CORS is now a requirement for any public data. So, Brad, your duty is to call the guy up and tell him. Or someone has to.

(Googling "CORS Everywhere", I found https://gitlab.com/spenibus/cors-everywhere-firefox-addon/blob/master/readme.txt and laughed.)


> It seems there is an educational outreach campaign needed to data providers on best practices and necessary steps to enable their data to be used in the web platform, so shouldn't that include making the data available over HTTPS alongside setting an "Access-Control-Allow-Origin: *" header?
> 

Well, the first thing is to fix browsers so that if they find an ostensibly secure origin loading insecure data, they downgrade the origin to being deemed insecure rather than blocking it.  Change the UI so that it doesn't get the green, happy, certified look.
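The proposed behaviour can be sketched in a few lines. To be clear, this is the proposal, not how any shipping browser works (browsers today block mixed active content); the function name and state strings are invented for illustration.

```python
# Sketch of the PROPOSED browser behaviour (not current practice):
# rather than blocking mixed content, downgrade the page's displayed
# security state when an HTTPS page loads "http:" data.
def origin_security_state(page_is_https: bool, loaded_insecure_data: bool) -> str:
    if not page_is_https:
        return "insecure"
    return "downgraded" if loaded_insecure_data else "secure"

print(origin_security_state(True, False))  # all-HTTPS page: keep the green look
print(origin_security_state(True, True))   # HTTPS page pulling HTTP data: lose it
```

The app keeps working; the user just sees an honest indicator instead of a broken mashup.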

The second thing is to roll out an upgrade of HTTP to HTTP over TLS on port 80, using connection upgrade.
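One existing mechanism along these lines is the HTTP Upgrade-to-TLS handshake of RFC 2817: the client asks to upgrade the port-80 connection, and on a "101 Switching Protocols" response switches the same socket to TLS. The sketch below only builds the raw request bytes (the function name and the choice of OPTIONS are illustrative); a real client would then perform the TLS handshake on the open connection.

```python
# Sketch of an upgrade request on port 80, per the HTTP Upgrade-to-TLS
# mechanism (RFC 2817). This only constructs the request text; a real
# client would switch the socket to TLS after "101 Switching Protocols".
def build_upgrade_request(host: str, path: str = "*") -> str:
    return (
        f"OPTIONS {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Upgrade: TLS/1.0\r\n"
        "Connection: Upgrade\r\n"
        "\r\n"
    )

print(build_upgrade_request("lod-cloud.net"))
```

The crucial property for the argument above: nothing in this exchange changes the "http:" URI the client was asked to dereference.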

That will take some time for the last 10%, but for those who are doing update cycles it will be fairly quick.

But it is less hassle, and more realistic, than changing all the HTTP links.



> -Brad Hill
>    


Received on Wednesday, 25 February 2015 16:04:03 UTC

This archive was generated by hypermail 2.3.1 : Monday, 23 October 2017 14:54:10 UTC