W3C home > Mailing lists > Public > public-webapps@w3.org > January to March 2010

Re: [WebTiming] HTMLElement timing

From: Jonas Sicking <jonas@sicking.cc>
Date: Thu, 4 Feb 2010 09:17:49 -0800
Message-ID: <63df84f1002040917q5cea6827p73376642bd442657@mail.gmail.com>
To: Lenny Rachitsky <lenny.rachitsky@webmetrics.com>
Cc: "public-webapps@w3.org" <public-webapps@w3.org>
On Mon, Feb 1, 2010 at 5:00 PM, Lenny Rachitsky
<lenny.rachitsky@webmetrics.com> wrote:
> I’d like to jump in here and address this point:
>
> “While I agree that timing information is important, I don't think it's
> going to be so commonly used that we need to add convenience features
> for it. Adding a few event listeners at the top of the document does
> not seem like a big burden.”
>
> I work for a company that sells a web performance monitoring service to
> Fortune 1000 companies. To give a quick bit of background to the monitoring
> space, there are two basic ways to provide website owners with reliable
> performance metrics for their web site/applications. The first is to do
> active/synthetic monitoring, where you test the site using an automated
> browser from various locations around the world, simulating a real user. The
> second approach is called passive or real user monitoring, which captures
> actual visits to your site and records the performance of those users. This
> second approach is accomplished with either a network tap appliance sitting
> in the customer's datacenter that captures all of the traffic that comes to
> the site, or using the “event listener” javascript trick, which times the
> client-side page performance and sends it back to a central server.
>
> Each of these approaches has pros and cons. The synthetic approach doesn’t
> tell you what actual users are seeing, but it is consistent and easy to
> set up and manage. The appliance approach is expensive and misses out on
> components that don't get served out of the one datacenter, but it sees
> real users' performance. The client-side javascript timing approach gives
> you very limited visibility, but is easy to set up and universally
> available. The limited nature of this latter javascript approach is the
> crux of why this “Web Timing” draft is so valuable. Website owners today have no way to
> accurately track the true performance of actual visitors to their website.
> With the proposed interface additions, companies would finally be able to
> not only see how long the page truly takes to load (including the
> pre-javascript execution time), but they’d also now be able to know how much
> DNS and connect time affect actual visitors’ performance, how much of an
> impact each image/object makes (an increasing source of performance
> issues), and ideally how much JS parsing and SSL handshakes add to the load
> time. This would give website owners tremendously valuable data that is
> currently impossible to reliably track.
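(The “event listener” javascript trick described above can be sketched roughly as follows. This is an illustrative sketch, not code from any monitoring product or spec: the beacon URL and the `computeLoadTime`/`reportTiming` names are assumptions made up for the example.)

```javascript
// Sketch of the client-side "event listener" timing trick.
// Record a timestamp as early as possible, e.g. in an inline script at
// the top of the document. Everything that happens before this point
// (DNS lookup, TCP connect, server response, HTML parsing up to this
// script) is invisible -- exactly the gap the Web Timing draft targets.
var scriptStart = Date.now();

// Pure helper: elapsed time between first-script execution and load.
function computeLoadTime(startMs, loadMs) {
  return loadMs - startMs;
}

// Hypothetical reporting hook: a real monitoring service would send the
// measurement back to a collection server, e.g. via an image beacon.
function reportTiming(ms) {
  new Image().src = '/beacon?load=' + encodeURIComponent(ms);
}

// Only wire up the listener when running in a browser context.
if (typeof window !== 'undefined') {
  window.addEventListener('load', function () {
    reportTiming(computeLoadTime(scriptStart, Date.now()));
  });
}
```

(Note how the measurement can only begin once the inline script runs, which is why this approach systematically underreports total load time.)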

Hi Lenny,

I agree that exposing performance metrics to the web page is a good
idea. I just disagree with the list of elements for which metrics are
being collected. Every element that we put on the list incurs a
significant cost to browser implementors, time that could be spent on
other, potentially more important, features. Just because something
could be useful doesn't mean that it's worth its cost.

Additionally, the more metrics are collected, the more browser
performance is spent measuring them. So there is a cost to
everyone else, both authors and users, too.

/ Jonas
Received on Thursday, 4 February 2010 17:18:41 GMT
