- From: <bugzilla@jessica.w3.org>
- Date: Mon, 18 Mar 2013 10:48:14 +0000
- To: public-html-bugzilla@w3.org
https://www.w3.org/Bugs/Public/show_bug.cgi?id=20760

Simon Waller <simon.waller@samsung.com> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |simon.waller@samsung.com

--- Comment #2 from Simon Waller <simon.waller@samsung.com> ---

These metrics look as though they were designed to report on a software decoder. Most embedded devices (TVs, STBs) use a hardware-based decoder, which is able to decode every frame it is asked to. It may be difficult for some implementations to report how many frames have been decoded, since decoding happens in low-level hardware and the hooks needed to retrieve that information probably don't exist. For hardware decoders, jitter is not measurable, and it would in any case be zero (or near enough to zero not to be a useful quality metric).

Another point to bear in mind is that the frame rate is not always constant. In the UK, the HD broadcasts on DTT dynamically switch between 50i and 25p (this is a valid thing to do and is within the video coding specification, AVC in this case).

When hardware decoders are used, the quality of the video is not determined by the speed of the decoder (which is always fast enough for the video codec profile in use) but by the bandwidth of the IP connection, i.e. by the quality of the video encoding at the maximum bitrate the IP connection can support.

BTW: since this is my first post, a few words of introduction. My background is in STB and digital TV development, but I now mainly cover standardisation activities such as DVB, OIPF and HbbTV. I will be joining the joint conf call tomorrow.

-- 
You are receiving this mail because:
You are the QA Contact for the bug.
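[Editor's sketch] The metrics being debated are frame-level playback quality counters. As an illustration only, the snippet below summarises a quality object of the rough shape under discussion (total and dropped frame counts); the field names `totalVideoFrames` and `droppedVideoFrames` and the helper itself are assumptions for this sketch, not anything defined in the bug. It also shows the commenter's point: a hardware decoder that never drops frames yields a dropped-frame ratio of zero, so the metric carries no quality signal on such devices.

```javascript
// Hedged sketch: summarise a quality object of the assumed shape
// {totalVideoFrames, droppedVideoFrames}. The field names are
// illustrative assumptions, not taken from the bug report.
function summariseQuality(q) {
  const dropped = q.droppedVideoFrames;
  const total = q.totalVideoFrames;
  // Guard against division by zero early in playback, before any
  // frames have been presented.
  const droppedRatio = total > 0 ? dropped / total : 0;
  return { dropped, total, droppedRatio };
}

// A hardware decoder, as described in the comment above, may always
// report zero dropped frames, so the ratio is always 0 on such devices.
const hardwareSample = { totalVideoFrames: 1500, droppedVideoFrames: 0 };
console.log(summariseQuality(hardwareSample).droppedRatio); // 0
```

On a software decoder the same helper would surface a meaningful ratio (e.g. 25 drops in 1000 frames gives 0.025), which is exactly the asymmetry the comment is pointing out.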
Received on Monday, 18 March 2013 10:48:16 UTC