
Re: [WebTiming] HTMLElement timing

From: Zhiheng Wang <zhihengw@google.com>
Date: Fri, 19 Feb 2010 11:04:33 -0800
Message-ID: <802863261002191104l5d81dae5va62548877a2de97b@mail.gmail.com>
To: James Robinson <jamesr@google.com>
Cc: "lenny.rachitsky" <lenny.rachitsky@webmetrics.com>, public-webapps@w3.org
Hi, James,

On Wed, Feb 17, 2010 at 10:36 PM, James Robinson <jamesr@google.com> wrote:

> A few more questions:
>
> * What should the values of domainLookupStart/domainLookupEnd be if the DNS
> lookup was served out of cache?
>

   It should be the time before/after the retrieval of the cached entry. The
latency for that is usually quite small from what I've seen in IE before,
e.g., a millisecond or two, but it depends on the load at the time.
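For illustration, computing DNS time from the draft's attributes might look
like the sketch below. The attribute names follow the draft
(domainLookupStart/domainLookupEnd); the shape of the timing object itself is
an assumption.

```javascript
// Sketch only: a cached DNS entry still yields a start/end pair; the
// difference is just very small, e.g. a millisecond or two.
function dnsLookupTime(timing) {
  return timing.domainLookupEnd - timing.domainLookupStart;
}

// With a cache hit the two timestamps are nearly equal:
var cachedHit = { domainLookupStart: 1266606273042, domainLookupEnd: 1266606273043 };
dnsLookupTime(cachedHit); // 1 ms
```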



> What about if the DNS resolution started before the fetch was initiated
> (i.e. if DNS prefetching is used or if the resource shares a domain with
> another resource that was fetched earlier)?
>

   Usually the prefetch is to get another object or just a 204, so I think
it makes sense to treat a later DNS lookup as served from cache?

   A question for browser gurus: is connection reuse based on domain or
IP? I presume it's on domain, but I'd like to make sure.



>
> * The specification requires that "The granularity and accuracy of the
> timing-related attributes in the DOMTiming and navigationTiming interface
> must be no less than one millisecond."  This is not generally possible on
> Windows due to the inaccuracy of system-provided timing APIs.  Could you
> relax this requirement so that it's possible to implement a compliant UA on
> all systems?
>

    Thanks for bringing this up. By saying that, I was hoping some sort of
fine-granularity timer was available on Windows rather than the system
clock. But if that proves to be impossible or inefficient, dropping the
requirement sounds like the way to go.
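As a rough illustration of the problem, a page can probe the effective
granularity of the script-visible clock by spinning until Date.now() changes
(this measures the clock exposed to script, not whatever internal timer a UA
might use):

```javascript
// Spin until the clock ticks and report the smallest observed step.
// On stock Windows timers this step can be well above 1 ms (commonly
// around 15.6 ms), which is what makes a hard "no less than one
// millisecond" requirement difficult for a UA to honor there.
function clockGranularityMs() {
  var t0 = Date.now(), t1;
  do { t1 = Date.now(); } while (t1 === t0);
  return t1 - t0;
}
```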



>
> * What precisely does 'parse' time mean for each element?  For example, on
> a <script> tag does parse time include parsing the script itself, or
> executing it as well?  What about for JS engines that do not distinguish
> between the two?
>

   It isn't meant to include JS execution, but rather the time spent
parsing the response and creating the DOM object. Could you please point
me to more details about JS engines that mix parsing and executing? I've
been thinking of adding JS execution time; that could be something
interesting.
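One concrete case where parsing and execution blur (an engine-behavior
observation, not something the draft defines): engines that parse lazily
only fully parse a function body on its first call, so "parse" work can end
up charged to execution time.

```javascript
// With lazy parsing, loading this script may only syntax-check the
// body of heavy(); the full parse can be deferred until the first
// call below, so a per-element "parse time" would undercount it.
function heavy() {
  // imagine a very large function body here
  return 42;
}
var result = heavy(); // first call may trigger the deferred full parse
```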

   Thanks for the input; I'll make changes accordingly...

cheers,
Zhiheng



>
> - James
>
>
> On Thu, Feb 18, 2010 at 5:07 PM, Zhiheng Wang <zhihengw@google.com> wrote:
>
>>
>>     FYI, I just made some minor updates to the draft based on the
>> discussion, like removing Ticks() and
>> narrowing down the list of elements that should provide the DOMTiming
>> interface. I am going to throw
>> in more details shortly as well.
>>
>> thanks,
>> Zhiheng
>>
>> On Tue, Feb 2, 2010 at 1:09 PM, lenny.rachitsky <
>> lenny.rachitsky@webmetrics.com> wrote:
>>
>>> I’d like to jump in here and address this point:
>>>
>>> “While I agree that timing information is important, I don't think it's
>>> going to be so commonly used that we need to add convenience features
>>> for it. Adding a few event listeners at the top of the document does
>>> not seem like a big burden.”
>>>
>>> I work for a company that sells a web performance monitoring service to
>>> Fortune 1000 companies. To give a quick bit of background to the
>>> monitoring
>>> space, there are two basic ways to provide website owners with reliable
>>> performance metrics for their web site/applications. The first is to do
>>> active/synthetic monitoring, where you test the site using an automated
>>> browser from various locations around the world, simulating a real user.
>>> The
>>> second approach is called passive or real user monitoring, which captures
>>> actual visits to your site and records the performance of those users.
>>> This
>>> second approach is accomplished with either a network tap appliance
>>> sitting in the customer's datacenter that captures all of the traffic
>>> that comes to the site, or using the “event listener” javascript trick,
>>> which times the client-side page performance and sends it back to a
>>> central server.
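The “event listener” trick referred to above is, in rough outline, the
following; the helper name and beacon endpoint are made up for illustration:

```javascript
// Rough sketch of the classic client-side timing trick. A script near
// the top of the document records a start time; an onload listener
// computes elapsed time and beacons it to a collection server.
function makeLoadTimer(now) {
  var start = now();            // runs as early in the page as possible
  return function elapsedMs() {
    return now() - start;       // misses everything before this script ran
  };
}

// In a browser this would be wired up roughly as:
//   var elapsed = makeLoadTimer(Date.now);
//   window.addEventListener('load', function () {
//     new Image().src = '/beacon?t=' + elapsed();  // hypothetical endpoint
//   }, false);
```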
>>>
>>> Each of these approaches has pros and cons. The synthetic approach
>>> doesn’t tell you what actual users are seeing, but it’s consistent and
>>> easy to set up and manage. The appliance approach is expensive and
>>> misses components that don’t get served out of the one datacenter, but
>>> it sees real users’ performance. The client-side javascript timing
>>> approach gives you very limited visibility, but is easy to set up and
>>> universally available. The limited nature of this latter javascript
>>> approach is the crux of why this “Web Timing” draft is so valuable.
>>> Website owners today have no way to accurately track the true
>>> performance of actual visitors to their website. With the proposed
>>> interface additions, companies would finally be able to not only see
>>> how long the page truly takes to load (including the pre-javascript
>>> execution time), but they’d also be able to know how much DNS and
>>> connect time affect actual visitors’ performance, how much of an impact
>>> each image/object makes (an increasing source of performance issues),
>>> and ideally how much JS parsing and SSL handshakes add to the load
>>> time. This would give website owners tremendously valuable data that is
>>> currently impossible to reliably track.
>>>
>>>
>>> Lenny Rachitsky
>>> Webmetrics
>>>
>>>
>>> James Robinson-5 wrote:
>>> >
>>> > On Tue, Feb 2, 2010 at 10:36 AM, Zhiheng Wang <zhihengw@google.com>
>>> wrote:
>>> >
>>> >> Hi, Olli,
>>> >>
>>> >> On Fri, Jan 29, 2010 at 6:15 AM, Olli Pettay
>>> >> <Olli.Pettay@helsinki.fi>wrote:
>>> >>
>>> >>>  On 1/27/10 9:39 AM, Zhiheng Wang wrote:
>>> >>>
>>> >>>> Folks,
>>> >>>>
>>> >>>>      Thanks to the feedback from various developers, the
>>> >>>> WebTiming spec has undergone some major revision. Timing info has
>>> >>>> now been extended to page elements, and a couple more interesting
>>> >>>> timing data points have been added. The draft is up on
>>> >>>> http://dev.w3.org/2006/webapi/WebTiming/
>>> >>>>
>>> >>>>      Feedback and comments are highly appreciated.
>>> >>>>
>>> >>>> cheers,
>>> >>>> Zhiheng
>>> >>>>
>>> >>>
>>> >>>
>>> >>> Like Jonas mentioned, this kind of information could be exposed
>>> >>> using progress events.
>>> >>>
>>> >>> What is missing in the draft, and actually in the emails I've seen
>>> >>> about this, is the actual use case for the web.
>>> >>> Debugging web apps can happen outside the web, e.g. with Firebug,
>>> >>> which investigates what the browser does at different times.
>>> >>> Why would a web app itself need all this information? To optimize
>>> >>> something, like using a different server if some server is slow?
>>> >>> But for that, (extended) progress events would be good.
>>> >>> And if the browser exposes all the information that the draft
>>> >>> suggests, it would make sense to dispatch some event when some new
>>> >>> information is available.
>>> >>>
>>> >>
>>> >>    Good point, and I do need to spend more time on the intro and
>>> >> use cases throughout the spec. In short, the target of this spec is
>>> >> web site owners who want to benchmark their user experience in the
>>> >> field. Debugging tools are indeed very powerful in development, but
>>> >> things could become quite different once the page is put to the
>>> >> wild, e.g., there is no telling about DNS and TCP connection time in
>>> >> the dev space; UGC only adds more complications to the overall
>>> >> latency of the page; and, "what is the right TTL for my dns record
>>> >> if I want to maintain a certain cache hit rate?", etc.
>>> >>
>>> >>
>>> >>> There are also undefined things like the paint event, which is
>>> >>> referred to in lastPaintEvent and paintEventCount.
>>> >>> And again, what is the use case for paintEventCount etc.?
>>> >>>
>>> >>
>>> >>    Something like Mozilla's MozAfterPaint?  I do need to work on more
>>> use
>>> >> cases.
>>> >>
>>> >
>>> > In practice I think this will be useless.  In a page that has any
>>> > sort of animation, blinking cursor, mouse movement plus hover
>>> > effects, etc. the 'last paint time' will always be immediately
>>> > before the query.  I would recommend dropping it.
>>> >
>>> > - James
>>> >
>>> >
>>> >>
>>> >>>
>>> >>> The name of the attribute is very strange:
>>> >>> "readonly attribute DOMTiming document;"
>>> >>>
>>> >>
>>> >>    agreed... how about something like "root_times"?
>>> >>
>>> >>
>>> >>>
>>> >>>
>>> >>> What is the reason for timing array in window object? Why do we need
>>> to
>>> >>> know anything about previous pages? Or what is the timing attribute
>>> >>> about?
>>> >>>
>>> >>
>>> >>   Something went missing in this revision, my bad. The intention is
>>> >> to keep previous pages' timing info only if these pages are all in a
>>> >> redirection chain. From the user's perspective, the waiting begins
>>> >> with the fetching of the first page in a redirection chain.
>>> >>
>>> >>
>>> >> thanks,
>>> >> Zhiheng
>>> >>
>>> >>
>>> >>>
>>> >>>
>>> >>>
>>> >>> -Olli
>>> >>>
>>> >>
>>> >>
>>> >
>>> >
>>>
>>>
>>>
>>>
>>>
>>
>
Received on Friday, 19 February 2010 19:05:10 GMT
