- From: Mark Watson <watsonm@netflix.com>
- Date: Thu, 10 May 2012 03:31:42 +0000
- To: Arvind Jain <arvind@google.com>
- CC: "public-web-perf@w3.org" <public-web-perf@w3.org>, "public-web-and-tv@w3.org WG" <public-web-and-tv@w3.org>
- Message-ID: <44AB9F9E-55D2-4317-AB1F-AA5428886B1C@netflix.com>
On May 9, 2012, at 5:12 PM, Arvind Jain wrote:

As of the spec now, we will look at the resource as a whole.

Do you mean that if multiple byte range requests are issued for a single URL then there will be only one PerformanceResourceTiming object? How would that work, given that each byte range request goes through the full resource request lifecycle: DNS, TCP, TLS, HTTP(S) etc.? Would the PerformanceResourceTiming object refer just to the last download? Last to start or last to complete?

Agree we need to think about this case. This is a good topic for our next call.

When is the call? I couldn't find it on the working group web pages. I'll try and join.

…Mark

Arvind

On Wed, May 9, 2012 at 11:43 AM, Mark Watson <watsonm@netflix.com> wrote:

On May 9, 2012, at 11:30 AM, Arvind Jain wrote:

Thanks Mark. It certainly makes sense to include performance of media elements in the Timing specifications. ResourceTiming will shortly be in CR, so it's best to put together a new spec for this use case. We could either do this in a ResourceTiming2 spec (no work has started on that front yet) or have a MediaResourceTiming specification (probably the better option).

The existing spec says in 4.1 that it already applies to media elements. I would argue that the lack of byte ranges in the name is a flaw already, since media elements commonly issue byte range requests, for example when seeking.

Re. your questions, the PerformanceResourceTiming object is made available after the resource has been downloaded.

So, then, I guess it would be a significant change to make the objects available earlier and provide "progress" type information. Perhaps that could be a feature for the next version?

…Mark

In case of HTTP pipelining, you are right, the pipelining delay will be included in (responseStart - requestStart). Wonder what folks think of adding an indicator re. pipelining.

Arvind

On Wed, May 9, 2012 at 10:31 AM, Mark Watson <watsonm@netflix.com> wrote:

Hi WebPerf,

It's been suggested in the HTML WG that measurement of download performance for <video> and <audio> media elements should be covered by the WebPerf work. These measurements are of interest to the Web & TV interest group, as they could be used to drive video bitrate adaptation in conjunction with the Media Source Extension [1] proposed for the media elements. In the current version of that extension, XHR is used to fetch the media. In a future version the media element may be provided with a sequence of (URL, byte range) pairs, and so it would be the media element doing the fetching.

I took a look at the Resource Timing draft [2] and it has a lot of great information that could be used for this purpose. Five things sprang to mind, though:

- Adaptive streaming players frequently issue byte range requests, so URL alone is not enough to identify the request. Should there be a syntax for the name of a resource timing entry for a byte range request which includes the byte range in the name?

- It might be nice to have an easy way to collect the PerformanceResourceTiming objects for a given audio or video element, for example an event that fires on the element each time one is created (when the request first starts).

- For adaptive streaming, the performance information needs to be available while the resource is still being downloaded. I couldn't find in the document whether this was the intention?
  Also, the expected behavior with respect to redirects is not completely clear: the UA doesn't know whether there will be a redirect until the response arrives, so before that it would have populated fetchStart, requestStart, responseStart etc. Should it clear all of those when it sees a redirect and has to start again?

- It would be good to have information about download progress, for example a bytesReceived field which returns the number of bytes received at the time it is read (this is what was proposed for the <video> element in the HTML bug referenced in the subject line [3]).

- UAs might use HTTP pipelining. In this case the difference between requestStart and responseStart is not the usual RTT plus server response time: it also includes the time required to complete the previous requests on the connection. It would be good at least to signal when this is the case, i.e. when the request has been pipelined.

What is the status of the Resource Timing work? Is it possible to consider the above comments for the current draft, or should I address them to a future version (is one planned)?

Best regards,
Mark Watson
Netflix

[1] http://dvcs.w3.org/hg/html-media/raw-file/tip/media-source/media-source.html
[2] http://w3c-test.org/webperf/specs/ResourceTiming/
[3] https://www.w3.org/Bugs/Public/show_bug.cgi?id=12399

Begin forwarded message:

From: <bugzilla@jessica.w3.org>
Subject: [Bug 12399] <video> add bytesReceived, downloadTime, and networkWaitTime metrics
Date: May 8, 2012 7:29:48 PM PDT
To: <watsonm@netflix.com>

https://www.w3.org/Bugs/Public/show_bug.cgi?id=12399

--- Comment #42 from Silvia Pfeiffer <silviapfeiffer1@gmail.com> 2012-05-09 02:29:47 UTC ---

(In reply to comment #40)

Silvia: So to confirm, you're agreeing that bytesReceived, downloadTime, and networkWaitTime are not media-specific and that we should move them to a WebPerf API rather than HTMLMediaElement?

I've spoken with some others and we agree that these three are not media-specific and could be progressed in WebPerf.

We have also identified that a generic DroppedFrames measure for video is important so Web developers can get information about the quality of playback their users are seeing. It basically signals how much "system bandwidth" is available for video. Web developers can gather these stats to make a better-informed decision about which bitrate resource to choose when starting the next video's playback, to switch to an alternative lower-bitrate resource mid-stream, to inform the user to close other apps, and to build a profile of typical bandwidth cases to decide which bitrates to encode resources into.

The DroppedFrames metric is already available in WebKit through the webkitDroppedFrames attribute and in Firefox through (mozPaintedFrames - mozParsedFrames).

--
Configure bugmail: https://www.w3.org/Bugs/Public/userprefs.cgi?tab=email
------- You are receiving this mail because: -------
You are on the CC list for the bug.
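As a rough illustration of the bitrate-adaptation use case described in Mark's list above, the sketch below shows how a script-based player might derive a throughput estimate from PerformanceResourceTiming entries and pick a bitrate. It is a sketch under assumptions, not part of any draft: it assumes each media segment has its own URL (since, as noted above, byte ranges are not reflected in the entry name), that the payload size is known to the player (for example from a manifest), and that helper names such as SegmentSample and chooseBitrate are purely illustrative.

```typescript
// Sketch: estimating download throughput for adaptive streaming from
// Resource Timing entries. Assumes each segment is fetched from a distinct
// URL; byte-range requests to one URL are not distinguishable with the
// current draft (the first point raised above).

interface SegmentSample {
  url: string;
  durationMs: number;  // responseStart .. responseEnd
  bits: number;        // payload size in bits, known to the player
}

function sampleSegment(url: string, payloadBytes: number): SegmentSample | null {
  // getEntriesByName returns every entry whose name matches the URL;
  // take the most recently completed fetch.
  const entries = performance.getEntriesByName(url, "resource") as PerformanceResourceTiming[];
  const last = entries[entries.length - 1];
  if (!last || last.responseEnd === 0) return null; // not finished yet

  // Using responseStart..responseEnd measures only the transfer itself,
  // so queuing/pipelining delay before the response is not counted.
  const durationMs = last.responseEnd - last.responseStart;
  if (durationMs <= 0) return null;

  return { url, durationMs, bits: payloadBytes * 8 };
}

// Pick the highest advertised bitrate that fits within a safety margin of
// the measured throughput.
function chooseBitrate(samples: SegmentSample[], ladder: number[], safety = 0.8): number {
  const bps = samples.map(s => s.bits / (s.durationMs / 1000));
  const estimate = bps.reduce((a, b) => a + b, 0) / Math.max(bps.length, 1);
  const affordable = ladder.filter(b => b <= estimate * safety);
  return affordable.length ? Math.max(...affordable) : Math.min(...ladder);
}
```

Note that this only works after each download completes, which is exactly why the thread asks for entries (or a bytesReceived progress field) to be exposed while the resource is still downloading.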
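For the DroppedFrames point in the forwarded comment, here is a minimal sketch of how a page might read the vendor-prefixed frame counters shipping around this time. The prefixed attribute names vary by engine and version (a webkitDroppedFrameCount-style counter in WebKit-derived browsers, moz* pipeline counters in Firefox), so the names and the subtraction below are assumptions to verify against target browsers, not a definitive mapping.

```typescript
// Sketch: reading vendor-prefixed frame counters to estimate how many
// decoded frames never reached the screen. Attribute names are assumptions
// based on prefixed APIs of the era; none are standardized.

function droppedFrameEstimate(video: HTMLVideoElement): number | null {
  const v = video as any; // prefixed attributes are not in the standard type

  if (typeof v.webkitDroppedFrameCount === "number") {
    // WebKit-derived engines expose a dropped-frame counter directly (assumed name).
    return v.webkitDroppedFrameCount;
  }
  if (typeof v.mozDecodedFrames === "number" && typeof v.mozPaintedFrames === "number") {
    // Firefox exposes separate pipeline counters; decoded-but-never-painted
    // frames are one way to approximate "dropped" (assumed combination).
    return v.mozDecodedFrames - v.mozPaintedFrames;
  }
  return null; // no known counter available in this UA
}
```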
Received on Thursday, 10 May 2012 03:32:15 UTC