- From: Glenn Maynard <glenn@zewt.org>
- Date: Tue, 18 Jan 2011 16:37:00 -0500
On Tue, Jan 18, 2011 at 8:40 AM, Boris Zbarsky <bzbarsky at mit.edu> wrote:

> On 1/18/11 6:09 AM, Glenn Maynard wrote:
>> I'm confused--how is the required buffer size a function of the length
>> of the video? Once the buffer is large enough to smooth out network
>> fluctuations, either you have the bandwidth to stream the video or you
>> don't; the length of the video doesn't enter into it.
>
> The point is that many users _don't_ have enough bandwidth to stream the
> video. At that point, the size of the buffer that puts you in
> HAVE_ENOUGH_DATA depends on the length of the video.

If you don't have enough bandwidth, then the necessary buffer size is
effectively the entire video[1]. Mikko seems to suggest that it's the
entire video times some multiplier, where that multiplier can be
discovered by binary searching. This doesn't make sense to me:

> static time period in seconds. This is required because a 5 second
> buffer could be enough for a 20 second clip but a 2 minute buffer could
> be required for one hour video. In both cases, the actual available

[1] (Of course, it's more precisely the size of the video minus a
function of the video size, bitrate, and user bandwidth--the amount of
data you can leave unbuffered at the end and still have it finish while
you're watching. I point this out because someone else will if I don't,
but I don't think it's relevant to any buffer-size algorithm: it's hard
to determine, and if you get it wrong for a long video you have a very
annoyed user with his movie interrupted two hours in.)

--
Glenn Maynard
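For concreteness, here is a minimal sketch of the arithmetic in footnote
[1], assuming a constant-bitrate video and a steady, known download rate
(neither holds in practice, which is exactly why the footnote calls this
hard to determine). The function name and units below are made up for
illustration; nothing here comes from a real browser implementation.

    def min_prebuffer_bytes(total_bytes, duration_s, bandwidth_bytes_per_s):
        """Smallest up-front buffer that lets an underpowered connection
        finish the download before playback catches up.

        While the full duration_s plays back, another
        bandwidth_bytes_per_s * duration_s bytes can still arrive; only
        the shortfall has to be buffered before playback starts.
        """
        arrives_during_playback = bandwidth_bytes_per_s * duration_s
        return max(0.0, total_bytes - arrives_during_playback)

    # Example: a one-hour video at 1 MB/s average bitrate (3600 MB total)
    # over a 0.9 MB/s link needs 360 MB buffered before playback starts.
    print(min_prebuffer_bytes(3600e6, 3600, 0.9e6))  # 360000000.0

Note that doubling the duration (same bitrate, same bandwidth) doubles
the required prebuffer, which is why the buffer that reaches
HAVE_ENOUGH_DATA grows with the length of the video whenever bandwidth
falls short of the bitrate.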
Received on Tuesday, 18 January 2011 13:37:00 UTC