- From: Nic Jansma <njansma@microsoft.com>
- Date: Wed, 15 Dec 2010 07:27:36 +0000
- To: "public-web-perf@w3.org" <public-web-perf@w3.org>
- Message-ID: <F677C405AAD11B45963EEAE5202813BD02894C40@TK5EX14MBXW651.wingroup.windeploy.ntde>
Last week on the conf call, we discussed some concerns about using the proposed Event Log<https://dvcs.w3.org/hg/webperf/raw-file/37cffb91ba1f/specs/ResourceTiming/Overview.html#rt-performanceresourcetiming-interface> interface for Resource Timing. One of the main concerns is the overhead of collecting and storing the timing information in the user agent for all external resources, especially if this is an always-on interface. We did a small investigation to estimate the scope of this overhead. In summary, we estimate the Resource Timing Event Log interface would add around 0.005% - 0.05% overhead to the private working set in Internet Explorer on most sites.

A couple of months ago, Zhiheng provided us with some interesting data regarding the average number of external resources on pages and their average URI string length:

* Sampled 4.3 billion pages.
* The average number of resources to fetch on a page is 42, with a 90th percentile of 145 and a maximum of 1,145.
* The sum of resource URL lengths averages 2,960 bytes, with a 90th percentile of 11.1 KB and a maximum of 664 KB.

A prototype of an object that would hold this timing data in Internet Explorer might look like this:

* 4 bytes: pointer to structure
* 2 bytes: initiatorType (unsigned short)
* 4 bytes: url: DOMString pointer
  o The URL might be interned elsewhere in the user agent.
* 10 bytes: id: DOMString pointer + estimated average length of 6 bytes
  o The ID might be interned elsewhere in the user agent.
* 104 bytes: 13 timestamps @ 8 bytes each (unsigned long long)
* = 124 bytes + URL string length

So on the "average page":

* (42 resources * 124 bytes) + 2,960 = 8,168 bytes

90th percentile page:

* (145 resources * 124 bytes) + 11,100 = 29,080 bytes

We loaded a few popular web sites, examined the number of external resources, and used the above prototype object to estimate how much actual overhead it would add:

Popular News Site #1

* 163 requests
* 26,410 bytes of URL
* (163 * 124) + 26,410 = 46,622 bytes
* Estimated overhead as % of IE private working set: 0.045%

Popular News Site #2

* 193 requests
* 20,417 bytes of URL
* (193 * 124) + 20,417 = 44,349 bytes
* Estimated overhead as % of IE private working set: 0.041%

Popular Search Engine

* 7 requests
* 603 bytes of URL
* (7 * 124) + 603 = 1,471 bytes
* Estimated overhead as % of IE private working set: 0.007%

Popular Portal

* 92 requests
* 11,381 bytes of URL
* (92 * 124) + 11,381 = 22,789 bytes
* Estimated overhead as % of IE private working set: 0.040%

These are just estimates and would certainly vary depending on our final spec and how it's implemented in each user agent. However, given the estimates above, the data collected by this proposed interface wouldn't add a significant amount of memory overhead (probably less than 0.05%). The CPU usage required to track and maintain these structures should be at around the same level of overhead as well.

- Nic
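For concreteness, below is a minimal C++ sketch of the kind of per-resource record and per-page arithmetic described in the message above. The struct and field names, the wchar_t* stand-in for DOMString, and the reading of "pointer to structure" as a list link are assumptions for illustration; only the field widths, the 124-byte accounting, and the sample page figures come from the message.

    #include <cstdint>
    #include <cstdio>

    // Illustrative sketch only: names are assumptions; field widths follow the
    // estimates in the message above (32-bit pointers, alignment padding ignored).
    struct ResourceTimingEntry {
        ResourceTimingEntry* next;           // 4 bytes: pointer to structure (assumed list link)
        uint16_t             initiatorType;  // 2 bytes: unsigned short
        const wchar_t*       url;            // 4 bytes: DOMString pointer (URL possibly interned elsewhere)
        const wchar_t*       id;             // 4 bytes: DOMString pointer (+ ~6 bytes average string data)
        uint64_t             timestamps[13]; // 104 bytes: 13 timestamps @ 8 bytes each (unsigned long long)
    };
    // Per-entry accounting from the message: 4 + 2 + 4 + (4 + 6) + 104 = 124 bytes,
    // plus the URL string itself.

    // Rough per-page overhead: (resource count * 124 bytes) + total URL bytes.
    static size_t EstimatePageOverhead(size_t resourceCount, size_t totalUrlBytes) {
        return resourceCount * 124 + totalUrlBytes;
    }

    int main() {
        // "Average" page from the sampled data: 42 resources, 2,960 URL bytes -> 8,168 bytes.
        std::printf("average page:  %zu bytes\n", EstimatePageOverhead(42, 2960));
        // 90th-percentile page: 145 resources, 11,100 URL bytes -> 29,080 bytes.
        std::printf("90th pct page: %zu bytes\n", EstimatePageOverhead(145, 11100));
        return 0;
    }

The two EstimatePageOverhead calls reproduce the 8,168-byte and 29,080-byte figures quoted above; an actual sizeof(ResourceTimingEntry) would differ somewhat because of alignment padding and pointer width.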
Received on Wednesday, 15 December 2010 07:28:18 UTC