
Re: Draft TTML Codecs Registry - Issue-305

From: Nigel Megitt <nigel.megitt@bbc.co.uk>
Date: Tue, 20 May 2014 14:34:57 +0000
To: Glenn Adams <glenn@skynav.com>
CC: Michael Dolan <mdolan@newtbt.com>, TTWG <public-tt@w3.org>
Message-ID: <CFA12339.1DF3E%nigel.megitt@bbc.co.uk>

On 20/05/2014 15:06, "Glenn Adams" <glenn@skynav.com> wrote:

On Tue, May 20, 2014 at 10:07 PM, Nigel Megitt <nigel.megitt@bbc.co.uk> wrote:
On 20/05/2014 00:25, "Glenn Adams" <glenn@skynav.com> wrote:
On Mon, May 19, 2014 at 10:54 PM, Nigel Megitt <nigel.megitt@bbc.co.uk> wrote:
On 19/05/2014 12:35, "Glenn Adams" <glenn@skynav.com> wrote:

On Mon, May 19, 2014 at 8:15 PM, Nigel Megitt <nigel.megitt@bbc.co.uk> wrote:
On 19/05/2014 12:05, "Glenn Adams" <glenn@skynav.com> wrote:
On Mon, May 19, 2014 at 6:27 PM, Nigel Megitt <nigel.megitt@bbc.co.uk> wrote:
On 18/05/2014 02:39, "Glenn Adams" <glenn@skynav.com> wrote:

On Fri, May 16, 2014 at 2:51 AM, Nigel Megitt <nigel.megitt@bbc.co.uk> wrote:
On 15/05/2014 23:45, "Glenn Adams" <glenn@skynav.com> wrote:

Could you cite the exact documents/sections that you are referring to by "quoted text"?

I was referring to the text from ISO/IEC 14496-12, AMD2 that Mike included in his email.

I assume you refer to:


From 14496-12, AMD2:

namespace is a null-terminated field consisting of a space-separated list, in UTF-8 characters, of one or more XML namespaces to which the sample documents conform. When used for metadata, this is needed for identifying its type, e.g. gBSD or AQoS [MPEG-21-7] and for decoding using XML aware encoding mechanisms such as BiM.

schema_location is an optional null-terminated field consisting of a space-separated list, in UTF-8 characters, of zero or more URL’s for XML schema(s) to which the sample document conforms. If there is one namespace and one schema, then this field shall be the URL of the one schema. If there is more than one namespace, then the syntax of this field shall adhere to that for xsi:schemaLocation attribute as defined by [XML]. When used for metadata, this is needed for decoding of the timed metadata by XML aware encoding mechanisms such as BiM.

This tells me nothing of why one would want to signal content profile or why one would want to communicate namespace usage separately (from XMLNS declarations found in the document).

The common text "When used for metadata, this is needed for … decoding using XML aware encoding mechanisms such as BiM." in both paragraphs appears to indicate the intended purpose of providing the information. Arguably, in our case we don't need a generic XML processing feature, in which case perhaps no value for this attribute is needed at all. That doesn't seem to be the case though, given the level of interest in this topic.


The processing behaviour may or may not be expressed in terms of TTML1-style profile features. There's no language other than prose available for this purpose (that I know of).

If a specification defines processing semantics that must be supported in order for a processor to conform to the specification, and if that specification does not define any feature/extension, then I would firstly view that as a broken specification; however, another potential interpretation is that the specification implies an otherwise unnamed feature/extension whose feature/extension designation corresponds to the profile designation. That is, the profile designation serves as a high-level, un-subdivided designation of the set of semantics mandated by compliance with the defined profile.

Concerning 'broken' I note also TTML1SE §3.3 [1] does require an implementation compliance statement (ICS) to support claims of compliance – it would seem reasonable to require this as an input to the registration process. Or in TTML2 weaken this requirement.

[1] http://www.w3.org/TR/ttml1/#claims

This might be a way out of this without having to have such specifications define individual, fine-grained feature/extension designations.

Yes, that would be helpful to lower the barrier to entry.

Anyway, I'm still waiting for someone to articulate a use case for signaling a content profile, or any aspect of a content profile (e.g., namespaces, schemas).

Did Mike's email including the relevant sections from 14496-12 not do this?

No, it does not. I repeat, signaling content profile can only have three purposes in the context of decoding/processing as far as I can tell:

(1) to validate an incoming document, which is not yet done by any TTML processor, though we are looking at adding a @validation attribute in TTML2 that could be used to require this;

(2) to imply a processor (decoder) profile in lieu of explicitly signaling a processor profile;

(3) to support generic XML processing functionality not specific to TTML.

Then it would seem to be outside the scope of TTML.

If we define an alternative to the generic set of {namespace, schemaLocation} then we need to explain how this requirement can be met otherwise we'll potentially break that generic requirement.

I don't think so. First, I don't agree there is a generic requirement that we need to solve. Or I haven't seen a clear explanation of such requirement or why we need to address it. [Assuming for a moment you are talking about namespaces, and not talking about the other discussion about MIME types.]

It seems clear to me from reading the ISO 14496 excerpt that this generic requirement exists for those who specify values to go in the @codecs parameter, which is what we're being asked to do to improve TTML interoperability in an ISO Base Media File Format context. Perhaps someone closer to that spec can explain if this is in fact the case?

It may be outside the scope of TTML but not outside the scope of what we've been asked to solve.

In the context of the current thread, it seems only the second of these is potentially relevant. However, I have to ask why one wouldn't simply signal a processor profile instead of using a more complex process of signaling a content profile and then having the decoder/processor infer a processor profile from that content profile.

My proposal is to signal a specification not a content profile.

I don't know what that means.

I defined it in my previous email on this thread at [1], also copied in to this email below, at "Thu, May 15, 2014 at 1:28 PM".


I read that email, but it did nothing to specify the problem that folks are attempting to solve. For example, it does not indicate how namespaces are related to content profiles.

"The content profile shall itself reference one or more namespaces and schema locations." to quote myself.

My position is that namespaces are not related to content profiles, and the former say nothing about the latter (in general). I'm not even sure there is a minimal mapping to a content profile that can be inferred from a set of namespace definitions.


So, if someone is suggesting that a list of namespaces serve as a shorthand for a content profile, I'm just not seeing it.


This meets both the stated requirement in 14496-12 of identifying the set of {namespace, schema} and the requirement to direct the choice of processor profile.

I don't think so.

So you don't think that a proposal that you don't understand does something. I don't know what to do with that information!

If we are going to stop moving around in circles, I need to see a simple, complete problem statement. I have yet to see one.


The closest I have at the moment (synthesised from what others have written) is:

  *   Define a set of short labels and combinatorial operators to be used in the first instance as extensions to the stpp.ttml prefix as used in the ISO/IEC 14496-12 @codecs parameter and as a suffix to the application/ttml+xml MIME type, and therefore conforming to any requirements defined by the specifications for those.

I agree with the part up to, but not including "and therefore conforming to...".

Well that's out of our control – in case I wasn't clear, I meant only that anything that we put into the @codecs parameter or MIME type must conform to the specifications for the @codecs parameter and for the MIME type.

  *   ISO/IEC 14496-12 requires signalling of the set of {namespace, schemaLocation} used by XML documents.

I'm not familiar with this spec or why it wants to signal namespaces. I don't know why it wants such a list. In any case, this seems something outside the scope of the TTML spec itself, in the realm of "how does 14496-12 deal with a particular XML format", which seems a general question, and completely unrelated to the discussion of profiles.

Yes it does seem to be a general question re XML – and hopefully following David's email we can remove this as a requirement. Note though that the request is not necessarily for profiles as currently drafted for TTML1 or TTML2 but for something which we would seek to map to/from TTML profiles. In other words profiles may or may not be the solution to whatever the problem is.

  *   A receiving system must be able to make processing choices based on the values prior to parsing a TTML document.

Must is a bit too strong. Should is better. There is nothing that prevents a receiver from fully decoding TTML before doing anything else, though it may be inefficient.

That's a misreading I think – I didn't say a receiving system must make processing choices, just that the solution must work for those that wish to make processing choices based on the parameter value.

I'd separately be comfortable with a requirement that receiving systems should use the externally represented conformance information to make choices, but I think that's out of our scope.

  *   It must be possible to specify receiving systems in terms of the parameter values that must minimally be supported.

Again, should is better than must. By "parameter values" here are you referring to MIME Content-Type parameters? If so, then I would characterize this as a first-order approximation to answering the question - can I process? This might return true, but during the actual parsing it may turn into false.

Again, it's not in our scope to mandate whether or not receiving systems are specified in terms of the parameter values.

If a parameter describes an 'offer' and the processor accepts that but then stops processing part way through I see that as a broken implicit contract. That kind of behaviour would be hugely unhelpful for users, so I believe we have a MUST requirement that the parameter values can be used to identify processors that are able to process the document being described. The exception would be if the claim for conformance made in the parameter value turns out not to be true upon deep inspection of the document.

This latter is what I'm referring to. An external parameter value will never be able to capture the richness of the existing TTML1 profile mechanism, which allows specifying a profile inline, possibly based on an existing profile (via 'use' attribute) or defined tabula rasa. As such, an external parameter that lists the same profile as the 'use' parameter might be construed as being supported, but only by parsing the profile elements in the document would one discover it is not supported, e.g., if the document requires support for a random, unknown feature not required by the baseline profile listed in the external parameter.

If processing requires support for a feature not listed in the dereferenced external profile parameter then IMHO that parameter has the wrong value. Or in my offer/contract terminology it's not a true (fair?) offer.

  *   It should be possible for those who wish, to be able to validate documents against the conformance claims made for them by the value of this parameter. (I think this is a weaker requirement than the others because this would necessarily require parsing the document so it is reasonable to use features internal to the document for validation.)

If you are referring to an external parameter on Content-Type, then I would not agree. There is no need to base validation processing on an external parameter that signals content profile. Only a document internal parameter, e.g., ttp:contentProfile, should be used for this purpose for a simple reason: external data is more likely to be wrong than internal data.

Given that 'validation' is a subset of 'processing', logically if we support 'can process' choice based on the parameter value then we must also support 'can validate' choice on the same basis – they are identical if the kind of processor is a validator.

I am explicitly treating them as different categories, and that validation would (if used) precede 'processing' in the normal sense of 'content processor' processing, e.g., presentation or transformation processing.

Thanks for clarifying. I'm still struggling to spot the difference though: if features from Grandma's-Homecooked-TT aren't known to Uncle's-Special-Recipe-TT validator then it won't be validated successfully even if it's otherwise conformant.

  *   Content providers should be able to derive acceptable values for this parameter given a previously unknown document, but may not need to if they know something already about how the document was created.

Where is this requirement coming from? What is the use case?

The use case is when there is a known predefined chain of author -> distribute -> process, where each step has external knowledge of the supported content and processor profiles so it's straightforward to add the relevant parameter value. This is usually the case in closed A/V environments such as those separately referenced by John. In this normal state content providers know enough about how the document was created to state the acceptable values. However if a document of unknown provenance needs to be fed into this chain, the content provider should be able to derive acceptable parameters by looking at the document.

I'd accept that this is a very weak requirement and removing it would probably make no meaningful difference.

The content provider may not want the receiving system to infer 'any possible' processor profile from the content profile but instead describe 'must be one of these' processor profiles.

Sure, in which case the content author should reference or specify a processor profile in the document, in which case signaling a content profile has no purpose.

That's fine but out of scope of the discussion, which is about what is specified externally to the document.

If there are other reasons for signaling content profile (in the context of the current thread) then I haven't seen them articulated.

On Thu, May 15, 2014 at 1:28 PM, Nigel Megitt <nigel.megitt@bbc.co.uk> wrote:
Since namespaces and schemas define and constrain document contents without defining processing behaviour, the quoted text defines a content profile declaration. It isn't asking for anything concerning specific processor capabilities but is merely describing the contents of the document. The information may be used for downstream processing by context-aware processors. The reference to namespace-aware compression makes clear that the mapping from whatever label scheme we choose to namespaces and schemas is important.

However it's clear that we expect the receiving system to use the information to direct its processing, as described previously.

Consider that the specification of a TTML variant x consists of the union of a content profile Cx and a description of processing behaviour Bx, which I'll express as S = C + B. The content profile shall itself reference one or more namespaces and schema locations. The processing behaviour may or may not be expressed in terms of TTML1-style profile features. There's no language other than prose available for this purpose (that I know of).

It is possible to define two specifications S1 and S2 where S1 = Cx + Bx and S2 = Cx + By, i.e. the same contents are processed with different behaviour. By the quoted text there is no need to differentiate between them from an ISO 14496 perspective. However we understand from our knowledge of the problem space that it may be useful to signal to a receiving system which behaviour set is desirable. And it may be helpful in a receiving system to differentiate between the available behaviours in order to provide the optimal experience.

Would it be contrary to the spirit of the ISO wording to assign short labels each corresponding to some Specification, and for receiving systems to be expected to dereference (using a cached lookup table!) from those labels to the namespaces and schema locations contained within that specification's content profile? This would satisfy the ISO requirements and permit us to signal additionally the processor features and behaviours. At this stage the expression of those is not our concern – just that there is a document somewhere that describes how the implementation should work.
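Such a cached lookup table could be sketched as below; the labels and their mappings are purely illustrative assumptions, not registered values (only the TTML namespace URI is real):

```python
# Hypothetical registry mapping short specification labels to the
# namespaces and schema locations referenced by that specification's
# content profile. Labels and entries are illustrative, not registered.
CODEC_REGISTRY = {
    "ttml1": {
        "namespaces": ["http://www.w3.org/ns/ttml"],
        "schema_locations": [],
    },
    "ebu-tt-d": {
        "namespaces": [
            "http://www.w3.org/ns/ttml",
            "urn:ebu:tt:metadata",  # assumed additional namespace
        ],
        "schema_locations": [],
    },
}

def dereference(label):
    """Map a short label to (namespaces, schema_locations)."""
    entry = CODEC_REGISTRY[label]
    return entry["namespaces"], entry["schema_locations"]
```

A receiving system holding such a table could satisfy the ISO {namespace, schemaLocation} requirement without the long strings appearing in the signalling itself.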

Going back to the previous example, if a document conforms to Cx then it could be signalled either as S1 or S2 or both, and if the content provider has verified that presentation will be acceptable either way then both S1 and S2 would be declared, otherwise just one of them (or neither if there's some other Sn that also uses Cx).

With this scheme combinatorial logic wouldn't really make sense – you could infer something about unions and intersections of content profiles but since the language used to describe processor behaviours can't be mandated (okay it could in theory, but it wouldn't be accepted in practice) it wouldn't be a well defined operation. Incidentally this is in no way a critique of the effort put in by Glenn, and its outcomes, in terms of defining content and processor profiles – though it might be nice to verify that this simple expression can be expanded into that scheme should a specification writer choose to do so.

This implies that every combination of content profiles and behaviours must be considered carefully and registered as a new specification with a new label. It also implies that if a document declares conformance with a set of specifications then it must conform to every member of the set of content profiles and it may be processed according to any one of the set of processing behaviours.
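A minimal sketch of those declaration semantics, reusing the earlier S1 = Cx + Bx, S2 = Cx + By example (all names hypothetical):

```python
# Hypothetical registry of specifications, each a (content profile,
# processing behaviour) pair, per the S = C + B formulation above.
REGISTRY = {
    "S1": {"content_profile": "Cx", "behaviour": "Bx"},
    "S2": {"content_profile": "Cx", "behaviour": "By"},
}

def required_content_profiles(declared):
    # The document must conform to every content profile in the declared set.
    return {REGISTRY[s]["content_profile"] for s in declared}

def acceptable_behaviours(declared):
    # A receiver may process according to any one of these behaviours.
    return {REGISTRY[s]["behaviour"] for s in declared}
```

Declaring {"S1", "S2"} then requires conformance to Cx only, while permitting processing under either Bx or By.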

The expression of that set is as described previously, where we pick our favourite delimiter out of a hat made out of ampersands.

Also: this topic was discussed in summary briefly on the call today and a new suggestion arose, that some guidance for 'reasons why the TTWG would reject an application for registration' would be helpful. When requiring combinations to be registered separately there's a greater need to ensure that the registration process is quick and painless, and this guidance would help us and those who may follow to expedite it.


On 15/05/2014 18:00, "Michael Dolan" <mdolan@newtbt.com> wrote:

I believe the problem statement is to replace the potentially unwieldy long strings in the namespace & schema_location fields defined in 14496-12 and 14496-30 with a more compact string suitable for the DASH manifest codecs field.

From 14496-12, AMD2:

namespace is a null-terminated field consisting of a space-separated list, in UTF-8 characters, of one or more XML namespaces to which the sample documents conform. When used for metadata, this is needed for identifying its type, e.g. gBSD or AQoS [MPEG-21-7] and for decoding using XML aware encoding mechanisms such as BiM.

schema_location is an optional null-terminated field consisting of a space-separated list, in UTF-8 characters, of zero or more URL’s for XML schema(s) to which the sample document conforms. If there is one namespace and one schema, then this field shall be the URL of the one schema. If there is more than one namespace, then the syntax of this field shall adhere to that for xsi:schemaLocation attribute as defined by [XML]. When used for metadata, this is needed for decoding of the timed metadata by XML aware encoding mechanisms such as BiM.

I’m warming up to the idea of requiring TTML content profiles be created for the combinations.


From: Glenn Adams [mailto:glenn@skynav.com]
Sent: Thursday, May 15, 2014 9:15 AM
To: Nigel Megitt
Cc: Michael Dolan; TTWG
Subject: Re: Draft TTML Codecs Registry

My understanding from Dave was that the problem is how to answer the following method:

boolean canPlay(String contentTypeWithParameters)

I have not seen any statement of a problem that relates to signaling content conformance.
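Purely as a sketch, a first-order answer to that method might come from the codecs parameter alone; the `stpp.ttml.` prefixing and the label set here are assumptions for illustration, not registered values, and a true answer here can still turn false once the document itself is parsed:

```python
import re

SUPPORTED_LABELS = {"ttml1"}  # labels this processor implements (assumed)

def can_play(content_type_with_parameters):
    """First-order 'can I process this?' check from the codecs parameter."""
    m = re.search(r'codecs="([^"]*)"', content_type_with_parameters)
    if not m:
        return False
    for codec in m.group(1).split(","):
        codec = codec.strip()
        if codec.startswith("stpp.ttml."):
            label = codec[len("stpp.ttml."):]
            if label in SUPPORTED_LABELS:
                return True
    return False
```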

As for requirements driving the ability to express a combination of profiles, we already have (in TTML1) and will have more (in TTML2) that permits a user to characterize processing requirements by means of a combination of existing profiles. Consequently, any shorthand signaling of first-order processor support needs to be able to repeat the expression of such combinations.

I don't buy any "it's too complex" argument thus far, primarily because nobody has stated what is (overly) complex in sufficient detail to understand if there is a problem or not.

My perception of the TTML profile mechanism is that it is easy to understand and implement, and, further, that it is a heck of a lot easier to understand and implement than XML Schemas.

On Thu, May 15, 2014 at 9:58 AM, Nigel Megitt <nigel.megitt@bbc.co.uk> wrote:
Agreed there's a gulf of understanding/expectation that we need to bridge.

Can anyone volunteer to draft a set of requirements for this functionality, in the first instance being the smallest set needed to meet the ISO specs? (Mike, I guess I'm thinking of you, following our discussion at the weekly meeting earlier)

On 15/05/2014 16:48, "Glenn Adams" <glenn@skynav.com> wrote:

I can see this subject is not going to be resolved easily as we clearly have a large gap about requirements; e.g., I think there are no requirements to signal content conformance, but only client processor requirements, I think we must use the TTML profile mechanism, etc.

On Thursday, May 15, 2014, Michael Dolan <mdolan@newtbt.com> wrote:
Maybe "highly undesirable", but if we don't address the A + B signaling
explicitly, then profiles need to be created for all the combinatorics of
namespaces in practice. Not the end of the world, but virtually prevents the
simple signaling of 3rd party namespaces already provided by the
namespace/schemaLocation mechanism today. No I am not proposing we use that
- I am pointing out a deficiency in this proposal that we already address
today in 14496.

Anyway, we need to go through the points in my email a week ago - if not
today, then on the 29th.


-----Original Message-----
From: David Singer [mailto:singer@mac.com]
Sent: Thursday, May 15, 2014 5:20 AM
To: Glenn Adams
Subject: Re: Draft TTML Codecs Registry


Though it will be a sub-parameter of the codecs parameter for the MP4 file
type, from the point of view of TTML it's actually a profile short name
registry rather than a codecs registry, so I think we should rename it.

the values here should be usable in both
a) the profiles parameter for the TTML mime type
b) the codecs parameter for the MP4 mime type

so, also "named codecs" -> "named profiles"

I agree with Cyril that we only need a single operator here (implement one
of these profiles and you're good to go), both because we don't need the
complexity, and because a "implement both/all of these" is effectively
inviting file authors to make up new profiles ("to process this document you
have to implement both A and B"), which is (IMHO) highly undesirable.

On May 15, 2014, at 9:55, Glenn Adams <glenn@skynav.com> wrote:

> See [1].
> [1] https://www.w3.org/wiki/TTML/CodecsRegistry

Dave Singer



This e-mail (and any attachments) is confidential and may contain personal views which are not the views of the BBC unless specifically stated.
If you have received it in error, please delete it from your system.
Do not use, copy or disclose the information in any way nor act in reliance on it and notify the sender immediately.
Please note that the BBC monitors e-mails sent or received.
Further communication will signify your consent to this.

Received on Tuesday, 20 May 2014 14:35:37 UTC

This archive was generated by hypermail 2.4.0 : Friday, 17 January 2020 17:43:35 UTC