
RE: RE: Re: add "networkDuration" to Resource Timing

From: Aaron Heady (BING AVAILABILITY) <aheady@microsoft.com>
Date: Wed, 7 Jan 2015 18:38:44 +0000
To: Ryan Pellette <ryan@catchpoint.com>, "public-web-perf@w3.org" <public-web-perf@w3.org>
Message-ID: <BN1PR03MB137042FCCA35C5EE781F011D1460@BN1PR03MB137.namprd03.prod.outlook.com>
I agree with essentially everything you said. As a site operator I'd love to have all that info. There just isn't currently a better mechanism to prevent rogue or hijacked sites from gathering sensitive information, and with the current method, to block the bad sites you have to block the good sites too.
Robots.txt is an unfortunate example to cite. Its default-allow is an outlier from a security perspective, and it isn't honored by the rogues either. If you don't want to be crawled, you have to implement access control. In the same manner, if you want a site to get timing information, you have to specifically allow it.
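(For reference, the explicit allow the spec already defines is the Timing-Allow-Origin response header: a third party that wants a specific site to see its full timing details has to send something like the following. The origin shown is illustrative, not taken from anyone's actual configuration.)

```http
HTTP/1.1 200 OK
Content-Type: image/png
Timing-Allow-Origin: https://www.example.com
```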
So, what we need is a better method to deal with "90% plus of the requests on a page are delivered from domains other than the site's domain."
Thoughts?
Aaron


From: Ryan Pellette [mailto:ryan@catchpoint.com]
Sent: Wednesday, January 7, 2015 10:27 AM
To: public-web-perf@w3.org
Subject: RE: Re: add "networkDuration" to Resource Timing

While I do understand the intention behind it, I am not sure it is really protecting anyone.
First of all, most ad systems are tracking almost everyone on every page, mapping user IDs behind the scenes, and bidding for ads in marketplaces.
Second, and more importantly, website owners need to know what URLs are on their pages and what impact those URLs have on performance. If a website owner (say Netflix) is worried that another site (say foo.com) will load its content, there should be a header to say "do not allow." In other words, don't block everything by default; make it work the way robots.txt works with search bots. Otherwise, what we currently have in this spec will never do what websites need it to do: 90-plus percent of the requests on a page are delivered from domains other than the site's own domain.
Finally, Duration would tell you whether an object was cached or not. Theoretically, foo.com could create a page that quietly requests https://secure.netflix.com/us/layout/ecweb/common/logo-reg2x.png; if the reported Duration was within milliseconds, foo.com could conclude the object came from cache, i.e. that the user had visited Netflix.
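(The cache probe described here can be sketched roughly as follows. The helper name and the millisecond threshold are illustrative assumptions on my part, not anything the spec defines:)

```javascript
// Sketch of the cross-origin cache probe (names and threshold are
// illustrative). In a browser, foo.com would load the image and then
// read its Resource Timing entry:
//
//   const img = new Image();
//   img.onload = () => {
//     const [entry] = performance.getEntriesByName(img.src);
//     if (looksCached(entry.duration)) {
//       // duration was only milliseconds => likely served from cache
//     }
//   };
//   img.src = 'https://secure.netflix.com/us/layout/ecweb/common/logo-reg2x.png';

// A duration of only a few milliseconds suggests a cache hit, while a
// real network fetch usually takes tens of milliseconds or more. The
// 10 ms cutoff is an assumed heuristic, not part of the spec.
function looksCached(durationMs, thresholdMs = 10) {
  return durationMs < thresholdMs;
}
```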
Received on Wednesday, 7 January 2015 18:39:16 UTC
