- From: <bugzilla@jessica.w3.org>
- Date: Mon, 25 Mar 2013 21:47:33 +0000
- To: public-html-bugzilla@w3.org
https://www.w3.org/Bugs/Public/show_bug.cgi?id=21327
Aaron Colwell <acolwell@chromium.org> changed:
What            |Removed                   |Added
----------------------------------------------------------------------------
Status          |NEW                       |ASSIGNED
CC              |                          |acolwell@chromium.org
Assignee        |adrianba@microsoft.com    |acolwell@chromium.org
--- Comment #2 from Aaron Colwell <acolwell@chromium.org> ---
(In reply to comment #0)
> Below are miscellaneous comments and/or questions.
>
> > The fade in coded frame equals new coded frame.
>
> What about if new coded frame duration is less than 5 ms?
This is covered in the audio splice rendering algorithm, but I will also add a
note in step 13 of the Audio Splice Frame Algorithm calling out this case. I
don't want to handle this explicitly in the algorithm because it will just make
things unnecessarily confusing.
>
> > Convert fade out samples and fade in samples to a common
> > sample rate and channel layout.
>
> The specification should mandate no sample rate conversion unless necessary.
Does this really need to be explicitly called out?
>
> >Apply a linear gain fade out
>
> From where to where?
What do you mean? If this isn't clear enough please provide text that you
consider more clear. I assumed that this plus the picture in the spec was
sufficient to convey the intended meaning.
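To illustrate the intended meaning: the fade-out gain falls linearly from 1.0 to 0.0 across the splice overlap while the fade-in gain rises from 0.0 to 1.0, and the two are mixed. This is only a sketch of my reading of the spec, not normative text; the function name and the use of plain sample lists are hypothetical, and the buffers are assumed to already share a sample rate and channel layout:

```python
def splice(fade_out_samples, fade_in_samples):
    """Crossfade two equal-length sample buffers with linear gains.

    Hypothetical sketch: the fade-out gain goes 1.0 -> 0.0 over the
    overlap, the fade-in gain goes 0.0 -> 1.0, and the scaled samples
    are summed. Buffers are assumed to be mono and already converted
    to a common sample rate and channel layout.
    """
    n = len(fade_out_samples)
    assert n == len(fade_in_samples), "overlap regions must match"
    out = []
    for i in range(n):
        gain_in = i / n            # rises linearly toward 1.0
        gain_out = 1.0 - gain_in   # falls linearly toward 0.0
        out.append(fade_out_samples[i] * gain_out +
                   fade_in_samples[i] * gain_in)
    return out
```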
>
> > the difference between decode timestamp and last decode timestamp is greater than 100 milliseconds
>
> Why 100 ms?
It seemed like a reasonable default: an arbitrary value chosen to ensure that
consecutive coded frames can't be too far apart before an out-of-order append
error is signalled.
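The check amounts to something like the following. This is a hypothetical sketch, not spec text; the function name is made up, timestamps are assumed to be in seconds, and I am reading "difference" as the absolute gap between the two decode timestamps:

```python
def is_out_of_order_append(decode_ts, last_decode_ts, threshold=0.100):
    """Return True when the gap between consecutive decode timestamps
    exceeds the threshold (100 ms by default, per the quoted text).

    Hypothetical sketch only; the spec defines the real condition in
    the coded frame processing algorithm.
    """
    return abs(decode_ts - last_decode_ts) > threshold
```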
--
You are receiving this mail because:
You are the QA Contact for the bug.
Received on Monday, 25 March 2013 21:47:38 UTC