- From: Silvia Pfeiffer <silviapfeiffer1@gmail.com>
- Date: Thu, 6 Oct 2011 16:36:00 +1100
On Thu, Oct 6, 2011 at 10:51 AM, Ralph Giles <giles at mozilla.com> wrote:

> On 05/10/11 04:36 PM, Glenn Maynard wrote:
>
>> If the files don't work in VTT in any major implementation, then probably
>> not many. It's the fault of overly-lenient parsers that these things happen
>> in the first place.
>
> A point Philip Jägenstedt has made is that it's sufficiently tedious to
> verify correct subtitle playback that authors are unlikely to do so with
> any vigilance. Therefore the better trade-off is to make the parser
> forgiving, rather than inflict the occasional missing cue on viewers.

That's a slippery slope to go down. If authors cannot see the consequence of a
mistake, they assume it's legal. It's not as if we are totally screwing up the
display - only the one mis-authored cue goes missing. But if we accept one type
of mis-authoring, where do we stop accepting weirdness? How can implementations
stay compatible if each one decides for itself which out-of-spec weirdness to
accept?

I'd rather we have strict parsing and recover from brokenness. It's the job of
validators to identify broken cues, and we should teach authors to run their
files through a validator before deciding that their files are ok.

As for some of the more dominant mis-authorings: we can accept them as correct
authoring, but then they have to be made part of the specification and
legalized.

Silvia.
Received on Wednesday, 5 October 2011 22:36:00 UTC
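Purely as an illustration of what "strict parsing" could mean at the cue-timing level (not something proposed in the thread itself), here is a minimal TypeScript sketch: an SRT-style comma decimal separator or a single-digit minutes field makes the whole timing line fail to parse, so the cue is dropped rather than silently repaired. The function names and the exact rules are assumptions made for this sketch, not the WebVTT parsing algorithm.

```typescript
// Illustrative only: strict check of a WebVTT cue timing line of the form
// "hh:mm:ss.ttt --> mm:ss.ttt" (hours optional, milliseconds always 3 digits).
const TIMESTAMP = /^(?:(\d{2,}):)?([0-5]\d):([0-5]\d)\.(\d{3})$/;

function parseTimestamp(s: string): number | null {
  const m = TIMESTAMP.exec(s);
  if (!m) return null; // e.g. "00:01:02,500" (comma) or "0:01:02.500" fails
  const [, hh, mm, ss, ttt] = m;
  return (
    (hh ? parseInt(hh, 10) : 0) * 3600 +
    parseInt(mm, 10) * 60 +
    parseInt(ss, 10) +
    parseInt(ttt, 10) / 1000
  );
}

// Strict: any malformed timestamp, or an end time not after the start time,
// rejects the whole cue instead of guessing at what the author meant.
function parseCueTiming(line: string): { start: number; end: number } | null {
  const parts = line.split("-->").map((p) => p.trim());
  if (parts.length !== 2) return null;
  const start = parseTimestamp(parts[0]);
  const end = parseTimestamp(parts[1]);
  if (start === null || end === null || end <= start) return null;
  return { start, end };
}
```

A validator built on the same check would flag the offending cue to the author instead of silently dropping it at playback time, which is the division of labour argued for above.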