- From: Alexandre Morgaut <alexandre.MORGAUT@4d.fr>
- Date: Tue, 23 Jun 2009 14:22:11 +0200
If browsers all send the Referer HTTP header, webmasters can always put a filter on incoming requests, but that won't prevent another webmaster from fetching these resources by tunneling them through their own server and sending a fake Referer header.

If you want to strictly restrict cross-domain resources, the only way would be the use of certificates, which would at least tell visitors that some of the resources they are about to see come from another domain, and at most could be forbidden by the browser. (Sorry, that requires SSL for now.)

Restricting cross-domain access to resources also raises the question of accessing them from outside any website, via URLs sent by mail, Twitter, or anything else...

-----Original Message-----
From: whatwg-bounces at lists.whatwg.org [mailto:whatwg-bounces at lists.whatwg.org] On behalf of Aryeh Gregor
Sent: Monday, 22 June 2009 22:50
To: Brad Kemper
Cc: whatwg at whatwg.org; Mikko Rantalainen; www-style at w3.org
Subject: Re: [whatwg] New work on fonts at W3C

On Mon, Jun 22, 2009 at 4:23 PM, Brad Kemper <brad.kemper at gmail.com> wrote:
> So your argument, in effect, is that site owners should not be allowed to
> restrict their content, because it might actually work? Or because older
> browsers and browsers that have yet to implement the standard could be used
> for the same sort of IP pirating as today?

No, I'm saying that browser vendors will refuse to implement it because it will break existing sites en masse and thereby anger their users. Ideally, I think it should have been possible to restrict cross-domain image loads from the start, but I don't see any way to do that *now* without breaking too many existing sites for browsers to be willing to go along with it.
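[Editor's note: the Referer filtering discussed at the top of this message could be sketched roughly as below. The function name and signature are hypothetical, chosen only for illustration; the point, as the message says, is that any server or client can forge the header, so this is discouragement rather than access control.]

```python
from urllib.parse import urlparse

def is_allowed_referrer(headers, allowed_host):
    """Allow a resource request only if the Referer header names our host.

    Hypothetical sketch: `headers` is a plain dict of request headers.
    A forged "Referer: https://allowed_host/" trivially defeats this check.
    """
    referer = headers.get("Referer")  # header may be absent entirely
    if referer is None:
        return False
    return urlparse(referer).hostname == allowed_host
```

A tunneling server defeats this by simply setting the header itself, e.g. `curl -H 'Referer: https://example.com/' https://example.com/font.woff`.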
Received on Tuesday, 23 June 2009 05:22:11 UTC