Re: Why the restriction on unauthenticated GET in CORS?

>> On Wed, Jul 18, 2012 at 4:41 AM, Henry Story <henry.story@bblfish.net> wrote:
>>
>>> 2. If there is no authentication, then the JS Agent could make the request via a CORS proxy of its choosing, and so get the content of the resource anyhow.
>>
>> Yes, the restriction on performing an unauthenticated GET only serves
>> to promote the implementation of 3rd party proxy intermediaries and,
>> if they become established, will introduce new security issues by way
>> of indirection.
>>
>> The pertinent question for cross-origin requests here is: who is
>> authoring the link and therefore in control of the request? The reason
>> that cross-origin JS, which executes 3rd party non-origin code within a
>> page, is not a problem for web security is that the author of the page
>> must explicitly include such a link. The control is within the
>> author's domain to apply prudence on what they link to and include
>> from. Honorable sites seek to protect their integrity by maintaining
>> bona-fide links to trusted and reputable 3rd parties.
>
> Yes, though in the case of a JS-based linked data application, like the semi-functioning one I wrote and described earlier
>   http://bblfish.github.com/rdflib.js/example/people/social_book.html
> ( not all links work, you can click on "Tim Berners Lee", and a few others )
> the original JavaScript is not fetching more JavaScript, but fetching more data from the web.
> Still your point remains valid. That address book needs to find ways to help show who says what, and of course not just load any JS it finds on the web, or else its reputation will suffer. My CORS proxy
> only serves RDFizable data.


Yes, I think you have run into a fundamental problem which must be
addressed in order for linked data to exist. Dismissal of early
implementation experience is unhelpful at best.

I find myself in a similar situation, whereby I have to write, maintain
and pay for the bandwidth of providing an intermediary proxy just to
service requests for public resources. This has real financial
consequences and is unacceptable when there is no technical grounding
for the restrictions. As stated before, it could even be regarded as a
form of censorship of freedom of expression, both for the author
publishing their work freely and for the consumer expressing new ideas.


>> On Wed, Jul 18, 2012 at 4:47 AM, Ian Hickson <ian@hixie.ch> wrote:
>>> No, such a proxy can't get to intranet pages.
>>>
>>> "Authentication" on the Internet can include many things, e.g. IP
>>> addresses or mere connectivity, that are not actually included in the body
>>> of an HTTP GET request. It's more than just cookies and HTTP auth headers.
>>
>> The vulnerability of unsecured intranets can be eliminated by applying
>> the restriction to private IP ranges, which are the source of this
>> attack vector. It is unsound (and potentially legally disputable) for
>> public access resources to be restricted and for public access
>> providers to pay the costs of protecting private resources. It
>> is the responsibility of the resource's owner to pay the costs of
>> enforcing their chosen security policies.
>
> Thanks a lot for this suggestion. Ian Hickson's argument had convinced me, but you have just provided a clean answer to it.
>
> If a mechanism can be found to apply restrictions to private IP ranges, then that should be used in preference to forcing the rest of the web to implement CORS restrictions on public data. And indeed servers behind firewalls use private IP ranges, which do in fact make a good distinguisher between public and non-public space.
>
> So the proposal is still alive it seems :-)
>

+1

I fully support the proposal.
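
To illustrate the kind of rule being proposed (a rough sketch only; the
function name and structure are mine, not spec text), the check is
cheap to state:

    # Sketch of the proposed rule: an unauthenticated cross-origin GET is
    # permitted only when every address the target hostname resolves to is
    # globally routable, i.e. not in a private, loopback or link-local range.
    import ipaddress
    import socket
    from urllib.parse import urlparse

    def targets_public_address(url):
        host = urlparse(url).hostname
        if host is None:
            return False
        try:
            infos = socket.getaddrinfo(host, None)
        except socket.gaierror:
            return False
        addresses = {info[4][0] for info in infos}
        return all(ipaddress.ip_address(addr).is_global for addr in addresses)

A real user agent would of course need to make this check against the
same resolution it actually connects with (to avoid DNS rebinding), but
that is an implementation detail inside the browser, not a cost pushed
onto every publisher of public data.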


>
> Social Web Architect
> http://bblfish.net/
>

Thanks,
Cameron Jones

Received on Thursday, 19 July 2012 13:22:48 UTC