- From: Dan Veditz <dveditz@mozilla.com>
- Date: Wed, 03 Oct 2012 16:08:55 -0700
- To: Peter Hultqvist <phq@silentorbit.com>
- CC: public-webappsec@w3.org
On 10/2/12 9:35 AM, Peter Hultqvist wrote:
> A side question would be why one choses to use HTTP headers for delivery
> rather than something like a robots.txt or crossdomain.xml file. I
> understand that using the header approach gives one much more fine
> tuning abilities thus the cause for the rest of my questions.

Blindly requesting a file that is extremely unlikely to exist on today's web led to terrible performance characteristics and extra useless load on servers. Given the grumbling about favicon.ico requests when those got added, we didn't want to go there again.

Mozilla's initial design did incorporate an optional header reference to a policy file to address those concerns: no extra request unless the server had already told you it existed, it kept the header compact in the face of complex policies, and although you'd take a latency hit on the first request, normal browser caching could make it a performance win overall. Sites with small, simple policies were better off keeping the policy in the header.

That feature didn't make CSP 1.0, but as we see more real-world use of CSP we can examine how it's being used and see whether such a feature would be worthwhile in practice.

-Dan Veditz
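[For context: the header-reference mechanism described above appeared in Mozilla's early X-Content-Security-Policy draft as a policy-uri directive. A rough sketch of the two delivery styles; the specific policy values and file path here are illustrative, not from the original message:]

```http
Content-Security-Policy: default-src 'self'; script-src 'self' https://cdn.example.com

X-Content-Security-Policy: policy-uri /csp-policy
```

The first form carries the whole policy inline, as CSP 1.0 standardized; the second only points at a cacheable policy file, so the browser makes no extra request unless the server has advertised that the file exists.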
Received on Wednesday, 3 October 2012 23:09:24 UTC