- From: Tab Atkins Jr. <jackalmage@gmail.com>
- Date: Thu, 31 Dec 2009 08:33:33 -0600
- To: Julian Reschke <julian.reschke@gmx.de>
- Cc: Silvia Pfeiffer <silviapfeiffer1@gmail.com>, Anne van Kesteren <annevk@opera.com>, Philip Jägenstedt <philipj@opera.com>, "Edward O'Connor" <hober0@gmail.com>, Jeremy Keith <jeremy@adactio.com>, HTMLwg <public-html@w3.org>
On Thu, Dec 31, 2009 at 7:20 AM, Julian Reschke <julian.reschke@gmx.de> wrote:
> Tab Atkins Jr. wrote:
>> ...
>>
>> It's really not what Gruber mentioned, though. He, and the rest of
>> us, don't want a page with 3+ (or 50+) <video>s to start buffering all
>> of them. That's obviously bad. But grabbing a little bit of
>> information from each, to determine intrinsic ratios and duration?
>> That's both fine, and roughly on par with downloading an image. We're
>> fine with the performance of 50+ images on a page, so why do we need
>> the ability to control <video> any more carefully?
>> ...
>
> Potentially because of servers without proper partial request (byte range)
> support...

That's fair. I wouldn't know how much, if any, of a problem this is, though. No one's mentioned it as an issue with implementations so far. Firefox devs, is this a problem with your current bandwidth-conservative approach?

~TJ
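[Editor's note: the concern above hinges on HTTP byte-range support. A minimal Python sketch of the kind of probe a user agent could make, with hypothetical helper names (not any browser's actual implementation): a ranged GET for the first kilobyte should come back as 206 Partial Content with a Content-Range header, while a server without partial-request support ignores the Range header and replies 200 with the full body, which is the case Julian raises.]

```python
def range_header(nbytes):
    """Build a Range header asking for only the first `nbytes` bytes."""
    return {"Range": "bytes=0-%d" % (nbytes - 1)}

def served_partial(status, headers):
    """True if the server honored the byte-range request.

    A server without byte-range support replies 200 and sends the
    whole resource, so a UA fetching just the metadata would end up
    buffering the entire video.
    """
    return status == 206 and "Content-Range" in headers

def parse_content_range(value):
    """Parse 'bytes 0-1023/567890' into (first, last, total)."""
    unit, _, rest = value.partition(" ")
    span, _, total = rest.partition("/")
    first, _, last = span.partition("-")
    return int(first), int(last), int(total)
```

[With these helpers, a metadata fetch would send `range_header(1024)` and check `served_partial(...)` on the response before deciding whether to keep reading.]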
Received on Thursday, 31 December 2009 14:34:01 UTC